Oct 01 12:37:46 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 12:37:46 crc restorecon[4735]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:46 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 
12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:47 crc 
restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 
12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 12:37:47 crc restorecon[4735]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 
12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc 
restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:47 crc restorecon[4735]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 12:37:48 crc kubenswrapper[4913]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:37:48 crc kubenswrapper[4913]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 12:37:48 crc kubenswrapper[4913]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:37:48 crc kubenswrapper[4913]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
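[Annotation] The long restorecon pass above reports "not reset as customized by admin" because container_file_t is registered as a customizable type (via container-selinux), so restorecon leaves those paths, including their MCS category pairs such as s0:c7,c13, untouched. A minimal sketch for auditing that pass, assuming the journal text is piped on stdin (e.g. journalctl -b -u kubelet, unit name assumed) and that entries match the format shown above; the script name and output layout are illustrative, not part of this log:

import re
import sys
from collections import Counter

# Matches entries like:
#   restorecon[4735]: /var/lib/kubelet/... not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
ENTRY = re.compile(
    r"restorecon\[\d+\]: (?P<path>/\S+) not reset as customized by admin to "
    r"(?P<ctx>\S+)"
)
POD = re.compile(r"/var/lib/kubelet/pods/([^/]+)/")

by_ctx = Counter()  # skipped paths per target SELinux context
by_pod = Counter()  # skipped paths per pod UID
# finditer (not per-line matching) so fused lines with several entries still parse
for m in ENTRY.finditer(sys.stdin.read()):
    by_ctx[m.group("ctx")] += 1
    pod = POD.search(m.group("path"))
    if pod:
        by_pod[pod.group(1)] += 1

print("skipped paths per SELinux context:")
for ctx, n in by_ctx.most_common():
    print(f"  {n:6d}  {ctx}")
print(f"pod-owned paths: {sum(by_pod.values())} across {len(by_pod)} pods")

[End annotation]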
Oct 01 12:37:48 crc kubenswrapper[4913]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 12:37:48 crc kubenswrapper[4913]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.555007 4913 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563050 4913 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563087 4913 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563092 4913 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563096 4913 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563100 4913 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563105 4913 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563110 4913 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563114 4913 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563118 4913 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563122 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563126 4913 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563132 4913 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563140 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563145 4913 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563150 4913 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563155 4913 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
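[Annotation] Each deprecation notice above points at the file given by the kubelet's --config flag. A minimal sketch of the config-file equivalents for the flagged options: the field names come from the kubelet.config.k8s.io/v1beta1 KubeletConfiguration API, the values are placeholders rather than this node's actual settings, and JSON output is used because any JSON document is also valid YAML for the kubelet's config parser:

import json

kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    # replaces --container-runtime-endpoint (socket path is a placeholder)
    "containerRuntimeEndpoint": "unix:///var/run/crio/crio.sock",
    # replaces --volume-plugin-dir
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
    # replaces --register-with-taints
    "registerWithTaints": [
        {"key": "node-role.kubernetes.io/master", "effect": "NoSchedule"}
    ],
    # replaces --system-reserved
    "systemReserved": {"cpu": "500m", "memory": "1Gi"},
    # --minimum-container-ttl-duration has no direct config field; the
    # warning above says to use eviction settings instead:
    "evictionHard": {"imagefs.available": "15%"},
}

with open("kubelet-config.json", "w") as f:
    json.dump(kubelet_config, f, indent=2)

--pod-infra-container-image is the exception: per the warning it is simply being removed, with the sandbox image coming from the CRI runtime's own configuration instead. [End annotation]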
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563159 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563163 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563166 4913 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563170 4913 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563174 4913 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563178 4913 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563183 4913 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563189 4913 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563192 4913 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563196 4913 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563200 4913 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563204 4913 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563208 4913 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563211 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563215 4913 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563225 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563229 4913 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563233 4913 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563237 4913 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563241 4913 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563244 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563249 4913 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
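The W-level feature_gate.go:330 lines name gates (apparently OpenShift-level ones) that the kubelet's embedded Kubernetes gate registry does not recognize, and the same block recurs several more times below, seemingly once per pass over the gate configuration, so deduplication helps when skimming. A small counting sketch (log_text is illustrative):

import re
from collections import Counter

UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\S+)")

def tally_unrecognized(log_text: str) -> Counter:
    """Count how often each unknown feature gate is warned about."""
    return Counter(UNRECOGNIZED.findall(log_text))

if __name__ == "__main__":
    sample = (
        "W1001 12:37:48.563050 4913 feature_gate.go:330] "
        "unrecognized feature gate: SetEIPForNLBIngressController\n"
        "W1001 12:37:48.564871 4913 feature_gate.go:330] "
        "unrecognized feature gate: SetEIPForNLBIngressController\n"
    )
    for gate, n in tally_unrecognized(sample).most_common():
        print(f"{n:3d}  {gate}")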
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563254 4913 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563258 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563263 4913 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563289 4913 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563293 4913 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563296 4913 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563301 4913 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563305 4913 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563309 4913 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563313 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563316 4913 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563320 4913 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563323 4913 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563327 4913 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563330 4913 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563333 4913 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563337 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563340 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563343 4913 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563347 4913 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563350 4913 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563354 4913 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563358 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563361 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563364 4913 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563369 4913 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563373 4913 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563376 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563381 4913 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563384 4913 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563388 4913 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563391 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.563395 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563490 4913 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563500 4913 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563508 4913 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563515 4913 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563520 4913 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563527 4913 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563533 4913 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563539 4913 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563543 4913 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563548 4913 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563553 4913 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563558 4913 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563563 4913 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563568 4913 flags.go:64] FLAG: --cgroup-root="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563573 4913 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563577 4913 flags.go:64] FLAG: --client-ca-file="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563581 4913 flags.go:64] FLAG: --cloud-config="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563585 4913 flags.go:64] FLAG: --cloud-provider="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563589 4913 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563594 4913 flags.go:64] FLAG: --cluster-domain="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563598 4913 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 12:37:48 crc 
kubenswrapper[4913]: I1001 12:37:48.563602 4913 flags.go:64] FLAG: --config-dir="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563605 4913 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563610 4913 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563620 4913 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563624 4913 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563628 4913 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563633 4913 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563637 4913 flags.go:64] FLAG: --contention-profiling="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563642 4913 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563645 4913 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563650 4913 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563653 4913 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563659 4913 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563663 4913 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563667 4913 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563670 4913 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563675 4913 flags.go:64] FLAG: --enable-server="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563679 4913 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563685 4913 flags.go:64] FLAG: --event-burst="100" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563689 4913 flags.go:64] FLAG: --event-qps="50" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563693 4913 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563697 4913 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563701 4913 flags.go:64] FLAG: --eviction-hard="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563706 4913 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563712 4913 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563716 4913 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563721 4913 flags.go:64] FLAG: --eviction-soft="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563725 4913 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563729 4913 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563733 4913 flags.go:64] FLAG: 
--experimental-allocatable-ignore-eviction="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563737 4913 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563740 4913 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563744 4913 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563748 4913 flags.go:64] FLAG: --feature-gates="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563754 4913 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563758 4913 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563762 4913 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563766 4913 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563770 4913 flags.go:64] FLAG: --healthz-port="10248" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563774 4913 flags.go:64] FLAG: --help="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563778 4913 flags.go:64] FLAG: --hostname-override="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563782 4913 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563787 4913 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563791 4913 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563795 4913 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563799 4913 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563803 4913 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563808 4913 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563812 4913 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563816 4913 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563820 4913 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563825 4913 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563829 4913 flags.go:64] FLAG: --kube-reserved="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563832 4913 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563836 4913 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563840 4913 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563845 4913 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563849 4913 flags.go:64] FLAG: --lock-file="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563853 4913 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563857 4913 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 
12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563861 4913 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563868 4913 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563872 4913 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563876 4913 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563880 4913 flags.go:64] FLAG: --logging-format="text" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563885 4913 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563890 4913 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563895 4913 flags.go:64] FLAG: --manifest-url="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563899 4913 flags.go:64] FLAG: --manifest-url-header="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563910 4913 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563915 4913 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563921 4913 flags.go:64] FLAG: --max-pods="110" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563926 4913 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563930 4913 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563935 4913 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563939 4913 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563944 4913 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563948 4913 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563953 4913 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563963 4913 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563967 4913 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563972 4913 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563976 4913 flags.go:64] FLAG: --pod-cidr="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563980 4913 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563987 4913 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563992 4913 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.563996 4913 flags.go:64] FLAG: --pods-per-core="0" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564001 4913 flags.go:64] FLAG: --port="10250" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564005 4913 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564009 4913 flags.go:64] FLAG: --provider-id="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564014 4913 flags.go:64] FLAG: --qos-reserved="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564018 4913 flags.go:64] FLAG: --read-only-port="10255" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564022 4913 flags.go:64] FLAG: --register-node="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564026 4913 flags.go:64] FLAG: --register-schedulable="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564030 4913 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564038 4913 flags.go:64] FLAG: --registry-burst="10" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564042 4913 flags.go:64] FLAG: --registry-qps="5" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564047 4913 flags.go:64] FLAG: --reserved-cpus="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564051 4913 flags.go:64] FLAG: --reserved-memory="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564057 4913 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564061 4913 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564066 4913 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564070 4913 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564074 4913 flags.go:64] FLAG: --runonce="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564078 4913 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564083 4913 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564087 4913 flags.go:64] FLAG: --seccomp-default="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564091 4913 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564095 4913 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564099 4913 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564104 4913 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564108 4913 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564113 4913 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564118 4913 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564122 4913 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564126 4913 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564130 4913 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564135 4913 flags.go:64] FLAG: --system-cgroups="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564140 4913 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564146 4913 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564151 4913 flags.go:64] FLAG: --tls-cert-file="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564155 4913 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564161 4913 flags.go:64] FLAG: --tls-min-version="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564165 4913 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564169 4913 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564174 4913 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564178 4913 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564182 4913 flags.go:64] FLAG: --v="2" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564188 4913 flags.go:64] FLAG: --version="false" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564195 4913 flags.go:64] FLAG: --vmodule="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564200 4913 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.564204 4913 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564727 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564734 4913 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564739 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564743 4913 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564747 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564750 4913 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564754 4913 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564758 4913 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564781 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564786 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564790 4913 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564794 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564798 4913 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564806 4913 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564809 4913 feature_gate.go:330] 
unrecognized feature gate: GatewayAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564813 4913 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564817 4913 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564821 4913 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564825 4913 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564828 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564834 4913 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564838 4913 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564843 4913 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564847 4913 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564852 4913 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564856 4913 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564860 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564864 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564867 4913 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564871 4913 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564874 4913 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564878 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564881 4913 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564885 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564888 4913 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564892 4913 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564896 4913 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564900 4913 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564905 4913 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564909 4913 feature_gate.go:330] unrecognized feature gate: 
NetworkLiveMigration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564914 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564918 4913 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564923 4913 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564927 4913 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564932 4913 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564941 4913 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564946 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564952 4913 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564957 4913 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564962 4913 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564966 4913 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564971 4913 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564976 4913 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564981 4913 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564985 4913 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564989 4913 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564993 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.564997 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565001 4913 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565004 4913 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565008 4913 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565011 4913 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565015 4913 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565018 4913 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565022 4913 feature_gate.go:330] unrecognized feature gate: 
NewOLM Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565028 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565031 4913 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565036 4913 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565041 4913 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565047 4913 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.565068 4913 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.565082 4913 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.579726 4913 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.579768 4913 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579897 4913 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579910 4913 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579918 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579926 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579939 4913 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579951 4913 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579960 4913 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579968 4913 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579976 4913 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579985 4913 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.579994 4913 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580002 4913 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580012 4913 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
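The I-level feature_gate.go:386 line above is the useful summary amid the warnings: the effective gate map, printed in Go's map[...] literal syntax, followed by the kubelet version (v1.31.5). A sketch that turns that rendering into a Python dict, assuming names and values contain no spaces (which holds for this output):

import re

MAP_RE = re.compile(r"feature gates: \{map\[(?P<body>[^\]]*)\]\}")

def parse_gates(line: str) -> dict[str, bool]:
    """Parse klog's 'feature gates: {map[Name:bool ...]}' rendering."""
    m = MAP_RE.search(line)
    if m is None:
        raise ValueError("no feature-gate map found")
    pairs = (item.split(":", 1) for item in m["body"].split())
    return {name: value == "true" for name, value in pairs}

if __name__ == "__main__":
    line = ("I1001 12:37:48.565082 4913 feature_gate.go:386] feature gates: "
            "{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}")
    print(parse_gates(line))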
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580021 4913 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580029 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580038 4913 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580047 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580055 4913 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580064 4913 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580074 4913 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580085 4913 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580094 4913 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580103 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580111 4913 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580119 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580126 4913 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580136 4913 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580144 4913 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580152 4913 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580160 4913 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580168 4913 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580176 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580184 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580191 4913 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580201 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580210 4913 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580218 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580226 4913 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:48 crc 
kubenswrapper[4913]: W1001 12:37:48.580234 4913 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580242 4913 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580250 4913 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580259 4913 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580295 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580304 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580312 4913 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580321 4913 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580329 4913 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580337 4913 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580345 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580353 4913 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580361 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580370 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580378 4913 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580387 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580394 4913 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580402 4913 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580411 4913 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580419 4913 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580430 4913 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
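Between the feature-gate blocks, the flags.go:64 lines dump every kubelet flag with its effective value in the form FLAG: --name="value". A collector sketch (sample lines copied from the dump above):

import re

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: --(?P<name>[\w.-]+)="(?P<value>[^"]*)"')

def parse_flags(log_text: str) -> dict[str, str]:
    """Collect the kubelet's FLAG dump into a name -> value dict."""
    return {m["name"]: m["value"] for m in FLAG_RE.finditer(log_text)}

if __name__ == "__main__":
    sample = (
        'I1001 12:37:48.563563 4913 flags.go:64] FLAG: --cgroup-driver="cgroupfs"\n'
        'I1001 12:37:48.563953 4913 flags.go:64] FLAG: --node-ip="192.168.126.11"\n'
    )
    for name, value in parse_flags(sample).items():
        print(f"--{name} = {value!r}")

Note that flag values are not necessarily what the kubelet ends up running with: the dump shows --cgroup-driver="cgroupfs", yet further down the kubelet reports using the systemd cgroup driver received from the CRI runtime.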
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580440 4913 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580449 4913 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580457 4913 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580466 4913 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580474 4913 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580483 4913 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580491 4913 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580499 4913 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580507 4913 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580515 4913 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580523 4913 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580532 4913 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.580546 4913 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580851 4913 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580866 4913 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580876 4913 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580885 4913 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580894 4913 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580904 4913 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580912 4913 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580920 4913 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580928 4913 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580936 4913 feature_gate.go:330] unrecognized feature gate: 
PlatformOperators Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580945 4913 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580953 4913 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580961 4913 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580970 4913 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580978 4913 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580988 4913 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.580998 4913 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581007 4913 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581016 4913 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581024 4913 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581032 4913 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581041 4913 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581051 4913 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581061 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581070 4913 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581079 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581087 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581097 4913 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581105 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581114 4913 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581124 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581133 4913 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581142 4913 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581151 4913 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581161 4913 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581170 4913 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581179 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581189 4913 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581197 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581206 4913 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581214 4913 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581222 4913 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581233 4913 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581243 4913 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581252 4913 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581261 4913 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581293 4913 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581302 4913 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581311 4913 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581319 4913 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581327 4913 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581335 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581344 4913 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581354 4913 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581364 4913 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581372 4913 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581381 4913 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581390 4913 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581398 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581407 4913 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581416 4913 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581424 4913 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581431 4913 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581440 4913 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581448 4913 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581456 4913 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581464 4913 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581472 4913 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:48 crc 
kubenswrapper[4913]: W1001 12:37:48.581481 4913 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581488 4913 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.581497 4913 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.581510 4913 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.581786 4913 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.587535 4913 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.587666 4913 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.591481 4913 server.go:997] "Starting client certificate rotation" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.591529 4913 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.593527 4913 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-10 15:19:02.483047406 +0000 UTC Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.593701 4913 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1682h41m13.889349925s for next certificate rotation Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.618033 4913 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.621440 4913 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.639449 4913 log.go:25] "Validated CRI v1 runtime API" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.681620 4913 log.go:25] "Validated CRI v1 image API" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.683744 4913 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.692083 4913 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-12-33-37-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.692137 4913 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} 
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.713479 4913 manager.go:217] Machine: {Timestamp:2025-10-01 12:37:48.708836908 +0000 UTC m=+0.612312526 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:35eb2588-e911-475f-90ba-39d796ce691f BootID:9b5c0a1c-ebf9-497b-a54a-617d247fddf8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2b:8b:de Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2b:8b:de Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:15:7b:aa Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:25:fd:d6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:80:16:a3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ae:c5:68 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:4d:92:72 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:62:da:9c:bf:96:2c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:ce:e2:c1:c7:e1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.713749 4913 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
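The manager.go:217 Machine line is a single cAdvisor struct dump with all sizes in bytes: MemoryCapacity 33654128640 B is about 31.3 GiB, /dev/vda4 (85292941312 B) about 79.4 GiB, and the vda disk (214748364800 B) exactly 200 GiB. A small sketch that restates those numbers, again assuming the journal was saved as kubelet.log:

    import re

    text = open("kubelet.log", encoding="utf-8", errors="replace").read()

    def gib(n: str) -> str:
        return f"{int(n) / 2**30:.1f} GiB"

    print("memory:", gib(re.search(r"MemoryCapacity:(\d+)", text).group(1)))
    print("cores:", re.search(r"NumCores:(\d+)", text).group(1))
    # filesystems from the Machine dump
    for dev, cap in re.findall(r"\{Device:(\S+) DeviceMajor:\d+ DeviceMinor:\d+ Capacity:(\d+)", text):
        print(f"{dev}: {gib(cap)}")
    # block devices from DiskMap
    for name, size in re.findall(r"\{Name:(\w+) Major:\d+ Minor:\d+ Size:(\d+)", text):
        print(f"disk {name}: {gib(size)}")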
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.713937 4913 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.715362 4913 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.715571 4913 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.715617 4913 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.715848 4913 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.715861 4913 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.716363 4913 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.716397 4913 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.716621 4913 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.716750 4913 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.720619 4913 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.720668 4913 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
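The nodeConfig= payload above is plain JSON, so it can be lifted out of the line and inspected directly; the hard-eviction thresholds it carries work out to 100Mi of free memory, 10%/5% of nodefs space/inodes, and 15%/5% of imagefs space/inodes. A minimal sketch, under the same kubelet.log assumption:

    import json
    import re

    for line in open("kubelet.log", encoding="utf-8", errors="replace"):
        m = re.search(r"nodeConfig=(\{.*\})", line)
        if m:
            cfg = json.loads(m.group(1))
            print("system reserved:", cfg["SystemReserved"])
            for t in cfg["HardEvictionThresholds"]:
                # a threshold is either an absolute Quantity or a Percentage
                limit = t["Value"]["Quantity"] or f'{t["Value"]["Percentage"]:.0%}'
                print(f'{t["Signal"]} {t["Operator"]} {limit}')
            break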
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.720686 4913 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.720702 4913 kubelet.go:324] "Adding apiserver pod source"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.720716 4913 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.726425 4913 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.727298 4913 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.729573 4913 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.729663 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.729763 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.730319 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.730507 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731782 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731816 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731829 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731840 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731856 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731865 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731875 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731892 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
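The connection-refused reflector errors against https://api-int.crc.testing:6443 are expected this early in startup: on this single-node CRC cluster the API server itself runs from the static pod manifests the kubelet has only just begun watching, so every list/watch attempt is refused until that pod is up. A sketch that tallies which resource types are failing, under the same kubelet.log assumption:

    import re
    from collections import Counter

    refused = Counter()
    for line in open("kubelet.log", encoding="utf-8", errors="replace"):
        if "connection refused" not in line:
            continue
        m = re.search(r"failed to list \*v1\.(\w+)", line)
        if m:
            refused[m.group(1)] += 1

    for kind, n in refused.most_common():
        print(f"{kind}: {n} refused list attempts")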
pluginName="kubernetes.io/downward-api" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731905 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731920 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731935 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.731946 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.732722 4913 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.733314 4913 server.go:1280] "Started kubelet" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.734317 4913 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 01 12:37:48 crc systemd[1]: Started Kubernetes Kubelet. Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.735094 4913 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.736162 4913 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.736679 4913 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.736719 4913 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.736937 4913 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:25:47.811280633 +0000 UTC Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.736998 4913 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1179h47m59.074287365s for next certificate rotation Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.737127 4913 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.737118 4913 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.737515 4913 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.737539 4913 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.737609 4913 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.738509 4913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="200ms" Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.738886 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.744935 4913 factory.go:55] Registering systemd factory Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.744964 4913 factory.go:221] Registration of the systemd container factory successfully Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.745032 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.745691 4913 factory.go:153] Registering CRI-O factory Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.745720 4913 factory.go:221] Registration of the crio container factory successfully Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.745897 4913 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.745927 4913 factory.go:103] Registering Raw factory Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.745948 4913 manager.go:1196] Started watching for new ooms in manager Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.747441 4913 manager.go:319] Starting recovery of all containers Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.752360 4913 server.go:460] "Adding debug handlers to kubelet server" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.750049 4913 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5e483947bdc8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 12:37:48.733259208 +0000 UTC m=+0.636734806,LastTimestamp:2025-10-01 12:37:48.733259208 +0000 UTC m=+0.636734806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762414 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762534 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762569 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762599 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762627 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762653 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762678 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762700 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762722 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762743 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762763 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762784 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762809 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762845 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762869 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762905 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762926 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762947 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762968 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.762990 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763012 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763032 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763052 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763074 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763094 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763116 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763139 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763164 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763191 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763211 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763231 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763253 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763307 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763331 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763352 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763372 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763392 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763415 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763440 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763462 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763484 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763506 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763527 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763548 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763572 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763592 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763612 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763634 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763655 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763676 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763698 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763721 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763749 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763772 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763795 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763820 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763841 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763863 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763884 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763906 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763927 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763947 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.763972 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764001 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764022 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764042 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764063 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764084 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764102 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764123 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764145 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764166 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764188 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764209 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764228 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764246 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764301 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764321 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764341 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764363 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764382 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764403 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764422 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764444 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764464 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764485 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764506 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764528 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764547 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764567 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764653 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764677 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764699 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764719 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764742 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764764 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764785 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764807 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764825 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764848 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764867 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764890 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764910 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764929 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764963 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.764987 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765008 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765031 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765052 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765072 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765096 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765120 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765143 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765167 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765190 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765211 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765232 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765250 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765300 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765322 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765343 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765362 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765391 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765410 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765428 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.765448 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771656 4913 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771734 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771766 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771790 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771813 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771836 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771857 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771878 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771925 4913 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771947 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771968 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.771990 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772027 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772051 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772072 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772096 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772117 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772140 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772162 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772181 4913 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772201 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772221 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772241 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772262 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772324 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772351 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772380 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772407 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772435 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772456 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772481 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772509 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772536 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772566 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772587 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772655 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772683 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772716 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772740 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772767 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772794 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772824 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772852 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772881 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772908 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772940 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772967 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.772995 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773024 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773067 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773097 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773127 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773153 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773174 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773199 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773223 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773244 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773297 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773321 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773340 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773363 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773384 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773418 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773440 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773461 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773483 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773503 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773523 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773541 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773560 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773579 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773599 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773620 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773639 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773659 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773677 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773702 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773725 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773746 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773769 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773789 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773811 4913 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773830 4913 reconstruct.go:97] "Volume reconstruction finished" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.773845 4913 reconciler.go:26] "Reconciler: start to sync state" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.783747 4913 manager.go:324] Recovery completed Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.797015 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.798910 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.799041 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.799071 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.800884 4913 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 12:37:48 crc kubenswrapper[4913]: 
I1001 12:37:48.800909 4913 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.800931 4913 state_mem.go:36] "Initialized new in-memory state store" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.802218 4913 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.804792 4913 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.805045 4913 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.805212 4913 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.805627 4913 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 12:37:48 crc kubenswrapper[4913]: W1001 12:37:48.806084 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.806158 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.821054 4913 policy_none.go:49] "None policy: Start" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.822372 4913 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.822548 4913 state_mem.go:35] "Initializing new in-memory state store" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.837249 4913 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.888486 4913 manager.go:334] "Starting Device Plugin manager" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.888575 4913 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.888592 4913 server.go:79] "Starting device plugin registration server" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.889087 4913 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.889106 4913 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.889652 4913 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.889769 4913 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.889778 4913 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.897730 4913 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.906102 4913 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.906198 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.907492 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.907516 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.907526 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.907638 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.907761 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.907804 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908383 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908405 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908416 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908546 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908549 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908600 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908622 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908669 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.908700 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910165 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910405 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910445 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910565 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910333 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910765 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.910822 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.911988 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912309 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912333 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912356 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912368 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912355 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912474 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912479 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912589 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.912628 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.913437 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.913462 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.913473 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.913621 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.913648 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.914261 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.914315 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.914327 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.915322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.915357 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.915369 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.939056 4913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="400ms" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.977677 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.977762 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.977799 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.977867 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.977926 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.977957 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978017 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978051 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978513 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978593 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978630 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978707 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978753 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978790 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.978837 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.989325 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.990722 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.990756 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.990764 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4913]: I1001 12:37:48.990786 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:48 crc kubenswrapper[4913]: E1001 12:37:48.991216 4913 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079777 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079832 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079861 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079884 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079905 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079926 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079947 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079967 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.079988 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080009 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080027 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080048 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080066 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080091 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080110 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080505 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080552 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080581 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080601 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080616 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080847 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080559 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080906 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 
12:37:49.080560 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080829 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.081006 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.081105 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.081123 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.081129 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.080569 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.192338 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.193876 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.193906 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.193914 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.193936 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:49 crc kubenswrapper[4913]: E1001 12:37:49.194417 4913 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" 
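The records above trace the kubelet's volume reconciler working through the host-path volumes of the five file-sourced static pods (etcd-crc, kube-apiserver-crc, kube-controller-manager-crc, openshift-kube-scheduler-crc, kube-rbac-proxy-crio-crc): each volume appears first as operationExecutor.VerifyControllerAttachedVolume, then as MountVolume started, then as MountVolume.SetUp succeeded. Below is a minimal, illustrative Go parser — not kubelet code — that tallies the succeeded mounts per pod from a journal stream shaped like this one; it assumes one journal record per input line (e.g. journalctl -u kubelet --no-pager), whereas the wrapped excerpt above may split records across lines.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	// Matches records of the form seen above:
	//   MountVolume.SetUp succeeded for volume \"<name>\" ... pod="<namespace>/<pod>"
	// The inner quotes are backslash-escaped in the raw journal text.
	re := regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\".*?pod="([^"]+)"`)

	mounts := map[string][]string{} // pod -> volumes with a succeeded SetUp
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // kubelet journal lines can be very long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			mounts[m[2]] = append(mounts[m[2]], m[1])
		}
	}

	pods := make([]string, 0, len(mounts))
	for p := range mounts {
		pods = append(pods, p)
	}
	sort.Strings(pods) // stable output order
	for _, p := range pods {
		fmt.Printf("%s: %d volume(s): %v\n", p, len(mounts[p]), mounts[p])
	}
}

Fed the excerpt above (one record per line), this would report, for example, six succeeded mounts for openshift-etcd/etcd-crc (resource-dir, cert-dir, log-dir, usr-local-bin, data-dir, static-pod-dir).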
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.232933 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.238664 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.257846 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.275094 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.281346 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.282138 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3fe33529f549bd2a213d74a13278a30d3b648f21e92ed1a5eda71b664d8adea0 WatchSource:0}: Error finding container 3fe33529f549bd2a213d74a13278a30d3b648f21e92ed1a5eda71b664d8adea0: Status 404 returned error can't find the container with id 3fe33529f549bd2a213d74a13278a30d3b648f21e92ed1a5eda71b664d8adea0
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.283089 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-303744aa90393cce610fa908d5faca8bad5f0d108915b1b9a9f19240f59189de WatchSource:0}: Error finding container 303744aa90393cce610fa908d5faca8bad5f0d108915b1b9a9f19240f59189de: Status 404 returned error can't find the container with id 303744aa90393cce610fa908d5faca8bad5f0d108915b1b9a9f19240f59189de
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.290777 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4ed78e6ce5ce09bc356fac89fb98dacc596ac2b77a1fcff8b8946cf19fb5fe3d WatchSource:0}: Error finding container 4ed78e6ce5ce09bc356fac89fb98dacc596ac2b77a1fcff8b8946cf19fb5fe3d: Status 404 returned error can't find the container with id 4ed78e6ce5ce09bc356fac89fb98dacc596ac2b77a1fcff8b8946cf19fb5fe3d
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.292235 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-dae5b42a5bfc27216713dbc60ead60fea190fcdfe7dc9df02ff753e6a4ef8c4b WatchSource:0}: Error finding container dae5b42a5bfc27216713dbc60ead60fea190fcdfe7dc9df02ff753e6a4ef8c4b: Status 404 returned error can't find the container with id dae5b42a5bfc27216713dbc60ead60fea190fcdfe7dc9df02ff753e6a4ef8c4b
Oct 01 12:37:49 crc kubenswrapper[4913]: E1001 12:37:49.339656 4913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="800ms"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.595381 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.596719 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.596764 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.596779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.596810 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: E1001 12:37:49.597306 4913 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.614880 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:49 crc kubenswrapper[4913]: E1001 12:37:49.614951 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.616144 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:49 crc kubenswrapper[4913]: E1001 12:37:49.616183 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.738375 4913 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:49 crc kubenswrapper[4913]: W1001 12:37:49.776771 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:49 crc kubenswrapper[4913]: E1001 12:37:49.776836 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.809474 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d2e8c0950a36ea3a32144df02f750542dc9a8ce71a1ccf430bb2429e279d0e25"}
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.810977 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dae5b42a5bfc27216713dbc60ead60fea190fcdfe7dc9df02ff753e6a4ef8c4b"}
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.814643 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ed78e6ce5ce09bc356fac89fb98dacc596ac2b77a1fcff8b8946cf19fb5fe3d"}
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.815993 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"303744aa90393cce610fa908d5faca8bad5f0d108915b1b9a9f19240f59189de"}
Oct 01 12:37:49 crc kubenswrapper[4913]: I1001 12:37:49.817684 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3fe33529f549bd2a213d74a13278a30d3b648f21e92ed1a5eda71b664d8adea0"}
Oct 01 12:37:50 crc kubenswrapper[4913]: W1001 12:37:50.002548 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:50 crc kubenswrapper[4913]: E1001 12:37:50.002641 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:50 crc kubenswrapper[4913]: E1001 12:37:50.140645 4913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="1.6s"
Oct 01 12:37:50 crc kubenswrapper[4913]: E1001 12:37:50.309172 4913 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5e483947bdc8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 12:37:48.733259208 +0000 UTC m=+0.636734806,LastTimestamp:2025-10-01 12:37:48.733259208 +0000 UTC m=+0.636734806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.397617 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.398987 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.399022 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.399032 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.399057 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 12:37:50 crc kubenswrapper[4913]: E1001 12:37:50.399526 4913 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.738383 4913 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.822831 4913 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="172032088e0e106a1a8d693e321529f0797c87184f058584fd31b3b0e51a4e2f" exitCode=0
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.822984 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.822991 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"172032088e0e106a1a8d693e321529f0797c87184f058584fd31b3b0e51a4e2f"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.824420 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.824458 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.824469 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.825153 4913 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee" exitCode=0
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.825225 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.825330 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.826431 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.826478 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.826496 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.830583 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.830649 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.830680 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.830602 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.830706 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.832889 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.832937 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.832956 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.833399 4913 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2" exitCode=0
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.833541 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.833648 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.834989 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.835035 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.835052 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.837124 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.837213 4913 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c194f3065b59be7df56f6732211fd7b73d965c39197e8598cc07619dda822a54" exitCode=0
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.837303 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c194f3065b59be7df56f6732211fd7b73d965c39197e8598cc07619dda822a54"}
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.837376 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.838029 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.838072 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.838091 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.838397 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.838430 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:50 crc kubenswrapper[4913]: I1001 12:37:50.838441 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:51 crc kubenswrapper[4913]: W1001 12:37:51.573697 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:51 crc kubenswrapper[4913]: E1001 12:37:51.573772 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.738040 4913 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:51 crc kubenswrapper[4913]: E1001 12:37:51.741915 4913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="3.2s"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.842053 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.842096 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.842108 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.842123 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.843110 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.843168 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.843178 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.845317 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.845341 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.845350 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.845359 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.845368 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.845444 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.846146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.846167 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.846176 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.847753 4913 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="703710d46ece24d979229e65e8473eb82f80e0eea4ff91cc50507a0a17d0336a" exitCode=0
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.847802 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"703710d46ece24d979229e65e8473eb82f80e0eea4ff91cc50507a0a17d0336a"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.847915 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.848829 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.848852 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.848862 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.851161 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.851622 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.851973 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6dbb602d4d3bc55422c43611204b0ef2761acaac79c7de903ccb0ce7289007b6"}
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.852512 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.852531 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.852540 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.852975 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.852992 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:51 crc kubenswrapper[4913]: I1001 12:37:51.853000 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.000073 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.002450 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.002483 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.002496 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.002521 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 12:37:52 crc kubenswrapper[4913]: E1001 12:37:52.003017 4913 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Oct 01 12:37:52 crc kubenswrapper[4913]: W1001 12:37:52.196411 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Oct 01 12:37:52 crc kubenswrapper[4913]: E1001 12:37:52.196505 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.398854 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.405809 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856555 4913 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4efc2fd8648bebea8996579fe75afdcfe005bcc0b4632d9673cc1eacdd8a6049" exitCode=0
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856650 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856718 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4efc2fd8648bebea8996579fe75afdcfe005bcc0b4632d9673cc1eacdd8a6049"}
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856753 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856813 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856651 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856759 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856758 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.856734 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.857876 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.857909 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.857925 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.857949 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.857968 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.857977 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.858781 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.858792 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.858805 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.858809 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.858815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.858820 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.859481 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.859511 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:52 crc kubenswrapper[4913]: I1001 12:37:52.859712 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.348012 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.864453 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.864507 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.864961 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865031 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e0150bbce62c26f62c9fd16bbf522d8e1ada5b8f5f4e4a9d8670730851f7fce5"}
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865071 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0f04f0b55a8a29ed94196dab28bfcf55f8eb251c6d65ebc06f56b28f821d108a"}
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865080 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db54a5acbe5757fbcc197e3fab686adecb87d4c6ca27d2d97cf20471e2c6129a"}
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865089 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e9f2aa06b5938b48d97c0cb55b00bdcaa3ae67ab88a15832c53a6397e6bea299"}
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865098 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31e3cf40fb768f0fe3d31a69e894ca680b88e19b98ed410f010c2468697a1450"}
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865163 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865498 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865530 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865540 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865771 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865804 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.865830 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.867463 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.867485 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:53 crc kubenswrapper[4913]: I1001 12:37:53.867515 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.779872 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867006 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867105 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867185 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867231 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867856 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867887 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.867896 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.869007 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.869025 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.869033 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.869005 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.869172 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:54 crc kubenswrapper[4913]: I1001 12:37:54.869214 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.159125 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.203574 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.206045 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.206102 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.206127 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.206163 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.841492 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.869591 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.869646 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.870469 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.870501 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.870510 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.871188 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.871233 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:55 crc kubenswrapper[4913]: I1001 12:37:55.871244 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.170949 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.171091 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.171958 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.171999 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.172014 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.347975 4913 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.348088 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.706991 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.707387 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.708812 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.708880 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:56 crc kubenswrapper[4913]: I1001 12:37:56.708899 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.569816 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.570016 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.571333 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.571376 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.571388 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.685996 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.686160 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.687386 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.687419 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:58 crc kubenswrapper[4913]: I1001 12:37:58.687429 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:58 crc kubenswrapper[4913]: E1001 12:37:58.897836 4913 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 01 12:38:02 crc kubenswrapper[4913]: W1001 12:38:02.467186 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.467313 4913 trace.go:236] Trace[1953406068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:52.465) (total time: 10001ms):
Oct 01 12:38:02 crc kubenswrapper[4913]: Trace[1953406068]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:38:02.467)
Oct 01 12:38:02 crc kubenswrapper[4913]: Trace[1953406068]: [10.001476317s] [10.001476317s] END
Oct 01 12:38:02 crc kubenswrapper[4913]: E1001 12:38:02.467342 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.539934 4913 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34894->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.539986 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34894->192.168.126.11:17697: read: connection reset by peer"
Oct 01 12:38:02 crc kubenswrapper[4913]: W1001 12:38:02.733164 4913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.733248 4913 trace.go:236] Trace[294655441]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:52.731) (total time: 10001ms):
Oct 01 12:38:02 crc kubenswrapper[4913]: Trace[294655441]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:38:02.733)
Oct 01 12:38:02 crc kubenswrapper[4913]: Trace[294655441]: [10.001589217s] [10.001589217s] END
Oct 01 12:38:02 crc kubenswrapper[4913]: E1001 12:38:02.733288 4913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.738962 4913 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.894753 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.896813 4913 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da" exitCode=255
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.896852 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da"}
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.897125 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.898285 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.898319 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.898329 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.898792 4913 scope.go:117] "RemoveContainer" containerID="55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.992241 4913 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.992324 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.997278 4913 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 01 12:38:02 crc kubenswrapper[4913]: I1001 12:38:02.997333 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 01 12:38:03 crc kubenswrapper[4913]: I1001 12:38:03.900359 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 01 12:38:03 crc kubenswrapper[4913]: I1001 12:38:03.901857 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b"}
Oct 01 12:38:03 crc kubenswrapper[4913]: I1001 12:38:03.902066 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:03 crc kubenswrapper[4913]: I1001 12:38:03.903037 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:03 crc kubenswrapper[4913]: I1001 12:38:03.903071 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:03 crc kubenswrapper[4913]: I1001 12:38:03.903080 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.166168 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.166354 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.167629 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.167666 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.167678 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.849054 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.849197 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.849335 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.850637 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.850700 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.850719 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.855295 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.906353 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.907294 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.907323 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:05 crc kubenswrapper[4913]: I1001 12:38:05.907332 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.208774 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.209011 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.210599 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.210652 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.210671 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.227609 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.349230 4913 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.349398 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.908524 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.908634 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.909966 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.910171 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.910386 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.909996 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.910700 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:06 crc kubenswrapper[4913]: I1001 12:38:06.910726 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.020532 4913 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.110324 4913 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.731724 4913 apiserver.go:52] "Watching apiserver"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.738343 4913 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.738673 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.739068 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.739105 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 01 12:38:07 crc kubenswrapper[4913]: E1001 12:38:07.739165 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.739387 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.739740 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:38:07 crc kubenswrapper[4913]: E1001 12:38:07.739876 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.739751 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.740405 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:38:07 crc kubenswrapper[4913]: E1001 12:38:07.740822 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.741915 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.742424 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.742651 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.742727 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.742854 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.743801 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.744001 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.745187 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.746110 4913 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-network-node-identity"/"env-overrides" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.768827 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.781397 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.793774 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.809427 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.822422 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.831146 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.838648 4913 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.839591 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.849487 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 01 12:38:07 crc kubenswrapper[4913]: E1001 12:38:07.979150 4913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.981330 4913 trace.go:236] Trace[546928567]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:57.594) (total time: 10386ms):
Oct 01 12:38:07 crc kubenswrapper[4913]: Trace[546928567]: ---"Objects listed" error: 10386ms (12:38:07.981)
Oct 01 12:38:07 crc kubenswrapper[4913]: Trace[546928567]: [10.386290851s] [10.386290851s] END
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.981368 4913 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.982102 4913 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.982165 4913 trace.go:236] Trace[923305634]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:56.310) (total time: 11671ms):
Oct 01 12:38:07 crc kubenswrapper[4913]: Trace[923305634]: ---"Objects listed" error: 11671ms (12:38:07.982)
Oct 01 12:38:07 crc kubenswrapper[4913]: Trace[923305634]: [11.671217092s] [11.671217092s] END
Oct 01 12:38:07 crc kubenswrapper[4913]: I1001 12:38:07.982367 4913 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 01 12:38:07 crc kubenswrapper[4913]: E1001 12:38:07.983180 4913 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.082648 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.082682 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.082702 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.082720 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.082737 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.082750 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083509 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083555 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083587 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083606 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083630 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083651 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083670 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083802 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083824 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083844 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083860 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083831 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083881 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083903 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083924 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083944 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.083960 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084059 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084082 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084092 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084215 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084236 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084405 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084477 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084512 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084609 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084713 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084800 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084821 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084845 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084917 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085006 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085032 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085126 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085212 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085314 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085481 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085545 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085565 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085644 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085751 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085879 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085913 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086050 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086141 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086283 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086308 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086454 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086571 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086594 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086639 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086721 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086792 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086812 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086832 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086921 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087010 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087069 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087217 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084166 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084324 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084556 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084729 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084866 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.084917 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085031 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085146 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085622 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085678 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085688 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085702 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085655 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.085879 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086105 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086156 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086145 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086299 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086349 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086552 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086711 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086933 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086953 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.086925 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087376 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087565 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087779 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087806 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087849 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087860 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087906 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087942 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.087974 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088013 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088041 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088070 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088093 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088121 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088124 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088146 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088172 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088197 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088220 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088246 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088251 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088441 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088295 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088663 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088569 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088690 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088703 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088868 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.089036 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.089064 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.089294 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.089537 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.090016 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.090315 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.090618 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.090864 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.090995 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091117 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091120 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091191 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091429 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091458 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091489 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.088762 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091569 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091664 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091710 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091725 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091737 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091719 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091767 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091797 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091823 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091850 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091883 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091909 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091934 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091963 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.091989 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092011 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092020 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092040 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092069 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092102 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092152 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092129 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092448 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.092852 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.093066 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.094821 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095593 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095671 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095714 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095757 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095772 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095795 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095830 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095864 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095903 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095939 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.095974 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096012 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096070 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096110 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096398 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096523 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096571 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096608 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096646 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096681 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096715 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096751 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096786 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096821 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096858 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096895 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096941 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.096979 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097031 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097133 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097178 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097219 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097257 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097325 4913 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097356 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097361 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097388 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097421 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097452 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097478 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097504 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097527 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097550 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097574 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097601 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097622 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097643 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097664 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097670 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097700 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097691 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097705 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097771 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097791 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097816 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097860 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097914 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097950 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.097986 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098025 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098061 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098087 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098102 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098007 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098139 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098175 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098205 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098235 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098312 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098323 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098379 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098426 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098509 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098504 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098552 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098627 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098678 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098745 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098718 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098780 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098801 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098809 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098838 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098868 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098958 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.098986 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099059 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099103 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099435 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099489 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099524 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099573 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099601 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099626 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099649 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099674 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099699 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099724 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099749 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099778 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099804 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099827 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099851 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099878 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099903 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099932 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099958 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.099984 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100081 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100110 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100136 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100163 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100187 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100210 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100236 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100283 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 
12:38:08.100313 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100338 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100391 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100424 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100452 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100476 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100501 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100580 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100612 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100638 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100905 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.100926 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.101059 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.101042 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.101205 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.101370 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.101701 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.101988 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.102205 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.102745 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.103007 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.103051 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.103933 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104162 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104496 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104532 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104563 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104590 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104615 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104642 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104726 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104741 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104754 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104767 4913 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104780 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104793 4913 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104805 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104819 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104833 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104845 4913 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104857 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104869 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104882 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104895 4913 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104909 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104928 4913 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104943 4913 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104957 4913 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104972 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104988 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105044 4913 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105056 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105070 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105085 4913 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105099 4913 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105112 4913 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105124 4913 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105137 4913 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105193 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc 
kubenswrapper[4913]: I1001 12:38:08.105208 4913 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105222 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105235 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105248 4913 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105279 4913 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105294 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105310 4913 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105323 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105335 4913 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105348 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105360 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105743 4913 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105759 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 
12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105796 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105887 4913 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105905 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105919 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105932 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105945 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105958 4913 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105973 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105987 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106001 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106014 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106027 4913 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106040 4913 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc 
kubenswrapper[4913]: I1001 12:38:08.106054 4913 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106067 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106080 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106092 4913 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106107 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106119 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106134 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106146 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106158 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106173 4913 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106186 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106198 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106211 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106223 
4913 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106238 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106250 4913 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106278 4913 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106294 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106308 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106320 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106333 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106348 4913 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106361 4913 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106378 4913 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106392 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106405 4913 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106418 4913 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106429 4913 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106442 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106476 4913 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106490 4913 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106503 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106518 4913 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106531 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106544 4913 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106556 4913 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106569 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106582 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106594 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106606 4913 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106619 4913 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.104969 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105077 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105291 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.105388 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106162 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106232 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106248 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106315 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.106411 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.107056 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.107644 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.107812 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.107879 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.109166 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.109253 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.109350 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.109694 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.110025 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.110697 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.110905 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.111091 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.111474 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.111540 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.111715 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.111902 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.112664 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:08.612624237 +0000 UTC m=+20.516099855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.112915 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.112977 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.113774 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114075 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114378 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.114744 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114780 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114825 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.114840 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:08.614817038 +0000 UTC m=+20.518292716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114896 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114912 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114746 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.114939 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.115026 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:08.615013944 +0000 UTC m=+20.518489682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.114837 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.115447 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.115468 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.115484 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.115882 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.115933 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.115974 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.116099 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.116362 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.116717 4913 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.116868 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.116972 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.117204 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.117208 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.117452 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.117618 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.117738 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.117868 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.118214 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.118258 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.118390 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.118500 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.118183 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.121963 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.122300 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.122791 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.123058 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.123469 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.124293 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.125050 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.130788 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.130862 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.130881 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.131016 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:08.630970476 +0000 UTC m=+20.534446144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.131865 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.134153 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.135191 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.136622 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.136648 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.136659 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.138966 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.139064 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.139152 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:08.639131332 +0000 UTC m=+20.542606920 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.139741 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.140930 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.141062 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.142053 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.142515 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.143254 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144152 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144182 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144609 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144763 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144793 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144843 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.144855 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.145092 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.145640 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.145705 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.145835 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.145932 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.145954 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.146076 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.146156 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.146534 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.146790 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.147374 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.148365 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.148544 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.148907 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.149080 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.149099 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.149106 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.149209 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.149331 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.150012 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.157627 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.158798 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.165914 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.178961 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207302 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207354 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207400 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207411 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207420 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207429 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207438 4913 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207446 4913 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207455 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207464 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207472 4913 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207480 4913 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207488 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207496 4913 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207504 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207513 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207520 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207528 4913 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207532 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207536 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.207396 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208142 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208198 4913 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208220 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 
01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208235 4913 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208250 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208284 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208300 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208315 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208328 4913 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208341 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208354 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208367 4913 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208379 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208391 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208403 4913 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208416 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" 
DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208428 4913 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208440 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208453 4913 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208465 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208477 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208488 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208500 4913 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208512 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208525 4913 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208537 4913 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208549 4913 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208560 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208573 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 
12:38:08.208584 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208597 4913 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208610 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208621 4913 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208633 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208644 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208656 4913 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208713 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208795 4913 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208815 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208831 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208844 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208858 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208870 4913 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208882 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208894 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208905 4913 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208917 4913 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208928 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208940 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208951 4913 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208963 4913 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.208975 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209020 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209033 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209044 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209056 4913 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209067 4913 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209079 4913 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209092 4913 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209104 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209116 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209129 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209141 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209152 4913 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209164 4913 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209176 4913 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209193 4913 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209210 4913 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209226 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209243 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209257 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209297 4913 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209311 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209334 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209347 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209362 4913 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209378 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209393 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209409 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209423 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.209436 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.363988 4913 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.377530 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:38:08 crc kubenswrapper[4913]: W1001 12:38:08.378195 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-78d1e15ad160c50d054e85f8e29057c92851617d703d349d7e0754ea84f0c1de WatchSource:0}: Error finding container 78d1e15ad160c50d054e85f8e29057c92851617d703d349d7e0754ea84f0c1de: Status 404 returned error can't find the container with id 78d1e15ad160c50d054e85f8e29057c92851617d703d349d7e0754ea84f0c1de Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.390731 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:38:08 crc kubenswrapper[4913]: W1001 12:38:08.390799 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-41e287d95fe0b26500b71d96fed6416a53020bb16ecf285bb666f8ec57d8c8de WatchSource:0}: Error finding container 41e287d95fe0b26500b71d96fed6416a53020bb16ecf285bb666f8ec57d8c8de: Status 404 returned error can't find the container with id 41e287d95fe0b26500b71d96fed6416a53020bb16ecf285bb666f8ec57d8c8de Oct 01 12:38:08 crc kubenswrapper[4913]: W1001 12:38:08.401042 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a28f6ab4080f9f1ed6d0a706a57647254eac0aa210640fa3d01a8ccddba2e28e WatchSource:0}: Error finding container a28f6ab4080f9f1ed6d0a706a57647254eac0aa210640fa3d01a8ccddba2e28e: Status 404 returned error can't find the container with id a28f6ab4080f9f1ed6d0a706a57647254eac0aa210640fa3d01a8ccddba2e28e Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.713192 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.713299 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.713359 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.713385 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713412 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:09.713387279 +0000 UTC m=+21.616862857 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.713454 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713479 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713485 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713516 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713528 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713539 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:09.713519872 +0000 UTC m=+21.616995450 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713565 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713592 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713607 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713618 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713576 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:09.713560724 +0000 UTC m=+21.617036292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713663 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:09.713643006 +0000 UTC m=+21.617118574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.713677 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:09.713671057 +0000 UTC m=+21.617146635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.808012 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.808112 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.809768 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.810239 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.811397 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.811966 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.812866 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.813389 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.813953 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.814852 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.815427 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.816294 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.816738 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.817739 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.818185 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.818554 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.818726 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.819609 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.820084 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.821758 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.822121 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.822698 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.823615 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.824036 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.825247 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.825674 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 12:38:08 crc 
kubenswrapper[4913]: I1001 12:38:08.826675 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.827091 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.827688 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.828854 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.829427 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.829909 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.830441 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.830975 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.831979 4913 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.832101 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.834035 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.835153 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.835787 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.837251 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.838022 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.839106 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" 
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.839723 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.840745 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.841210 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.841718 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.842255 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.842864 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.843864 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.844302 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.845170 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.845654 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.846713 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.847139 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.847900 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.848336 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.849175 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.849795 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.850211 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.851920 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.860538 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.868875 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.913344 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030"}
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.913394 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"78d1e15ad160c50d054e85f8e29057c92851617d703d349d7e0754ea84f0c1de"}
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.914835 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.915205 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.917153 4913 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b" exitCode=255
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.917222 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b"}
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.917283 4913 scope.go:117] "RemoveContainer" containerID="55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da"
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.918670 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a28f6ab4080f9f1ed6d0a706a57647254eac0aa210640fa3d01a8ccddba2e28e"}
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.920099 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9"}
Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.920122 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1"} Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.920132 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"41e287d95fe0b26500b71d96fed6416a53020bb16ecf285bb666f8ec57d8c8de"} Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.927645 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.929144 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.929190 4913 scope.go:117] "RemoveContainer" containerID="ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b" Oct 01 12:38:08 crc kubenswrapper[4913]: E1001 12:38:08.929596 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.936713 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.959632 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.976877 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.989854 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:08 crc kubenswrapper[4913]: I1001 12:38:08.998608 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.007260 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:02Z\\\",\\\"message\\\":\\\"W1001 12:37:51.810798 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 12:37:51.811167 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759322271 cert, and key in /tmp/serving-cert-1778987673/serving-signer.crt, /tmp/serving-cert-1778987673/serving-signer.key\\\\nI1001 12:37:52.095398 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:37:52.097151 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:37:52.097413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:52.098629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1778987673/tls.crt::/tmp/serving-cert-1778987673/tls.key\\\\\\\"\\\\nF1001 12:38:02.535255 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.020547 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.028508 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.036819 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.047996 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.057226 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.072426 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.516703 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t7565"] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.517018 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.517731 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zqn52"] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.518078 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.525957 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.526074 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.526085 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.525967 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.526282 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.526025 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.526462 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.526562 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.541807 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.555619 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.569152 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.581888 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:02Z\\\",\\\"message\\\":\\\"W1001 12:37:51.810798 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 
12:37:51.811167 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759322271 cert, and key in /tmp/serving-cert-1778987673/serving-signer.crt, /tmp/serving-cert-1778987673/serving-signer.key\\\\nI1001 12:37:52.095398 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:37:52.097151 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:37:52.097413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:52.098629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1778987673/tls.crt::/tmp/serving-cert-1778987673/tls.key\\\\\\\"\\\\nF1001 12:38:02.535255 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.595899 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.611146 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618206 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f76l\" (UniqueName: \"kubernetes.io/projected/b2420adf-64bd-4d67-ac95-9337ed10149a-kube-api-access-4f76l\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618245 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-system-cni-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618288 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-os-release\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618308 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-daemon-config\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618394 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-kubelet\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618426 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-cni-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618485 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-cnibin\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618535 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-socket-dir-parent\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618557 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-netns\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618588 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-cni-multus\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618612 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-hostroot\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618634 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhsm2\" (UniqueName: \"kubernetes.io/projected/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-kube-api-access-xhsm2\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618664 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-cni-bin\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618685 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-etc-kubernetes\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618710 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-hosts-file\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618745 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2420adf-64bd-4d67-ac95-9337ed10149a-cni-binary-copy\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618769 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-k8s-cni-cncf-io\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618814 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-conf-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.618836 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-multus-certs\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.624829 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.639479 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.656004 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.675259 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:02Z\\\",\\\"message\\\":\\\"W1001 12:37:51.810798 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 12:37:51.811167 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759322271 cert, and key in /tmp/serving-cert-1778987673/serving-signer.crt, /tmp/serving-cert-1778987673/serving-signer.key\\\\nI1001 12:37:52.095398 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:37:52.097151 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:37:52.097413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:52.098629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1778987673/tls.crt::/tmp/serving-cert-1778987673/tls.key\\\\\\\"\\\\nF1001 12:38:02.535255 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.689453 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.699092 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.711824 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.719803 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.719876 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-conf-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.719896 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-multus-certs\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.719920 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.719941 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.719917696 +0000 UTC m=+23.623393274 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.719981 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.719981 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720052 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-conf-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720031 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-multus-certs\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720036 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.720020649 +0000 UTC m=+23.623496307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720125 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f76l\" (UniqueName: \"kubernetes.io/projected/b2420adf-64bd-4d67-ac95-9337ed10149a-kube-api-access-4f76l\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720161 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-system-cni-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720182 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-os-release\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720202 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-daemon-config\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720236 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-system-cni-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720239 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720069 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720300 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-cni-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720319 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-cnibin\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720331 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.720321937 +0000 UTC m=+23.623797515 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720334 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720350 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-socket-dir-parent\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720357 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720370 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720374 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-netns\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720391 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-cni-multus\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720413 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.720396049 +0000 UTC m=+23.623871727 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720421 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-cni-multus\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720425 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-socket-dir-parent\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720436 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-cni-dir\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720351 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-cnibin\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720449 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-netns\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720435 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-kubelet\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720463 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-kubelet\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720482 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-hostroot\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720499 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhsm2\" (UniqueName: \"kubernetes.io/projected/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-kube-api-access-xhsm2\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565"
\"kubernetes.io/projected/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-kube-api-access-xhsm2\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720513 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-hosts-file\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720538 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2420adf-64bd-4d67-ac95-9337ed10149a-cni-binary-copy\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720544 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-os-release\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720551 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-k8s-cni-cncf-io\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720571 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-run-k8s-cni-cncf-io\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720585 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-cni-bin\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720601 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-etc-kubernetes\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720640 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720724 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720729 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-hostroot\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720737 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720764 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720767 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-etc-kubernetes\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720602 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-hosts-file\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720752 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2420adf-64bd-4d67-ac95-9337ed10149a-host-var-lib-cni-bin\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.720799 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.72078899 +0000 UTC m=+23.624264678 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.720966 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b2420adf-64bd-4d67-ac95-9337ed10149a-multus-daemon-config\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.721020 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2420adf-64bd-4d67-ac95-9337ed10149a-cni-binary-copy\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.725166 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.746608 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.747049 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhsm2\" (UniqueName: \"kubernetes.io/projected/0cfb3767-c920-41f1-9c7b-88828a9a4ba4-kube-api-access-xhsm2\") pod \"node-resolver-t7565\" (UID: \"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\") " pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.747283 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f76l\" (UniqueName: \"kubernetes.io/projected/b2420adf-64bd-4d67-ac95-9337ed10149a-kube-api-access-4f76l\") pod \"multus-zqn52\" (UID: \"b2420adf-64bd-4d67-ac95-9337ed10149a\") " pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.794937 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.806436 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.806504 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.806568 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.806811 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.820035 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.828045 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t7565" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.836802 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zqn52" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.915761 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-57qvb"] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.916618 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7v2m5"] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.916878 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.917352 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8hltg"] Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.917482 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.917880 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.920761 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.921550 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.921734 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 12:38:09 crc kubenswrapper[4913]: W1001 12:38:09.921759 4913 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 01 12:38:09 crc kubenswrapper[4913]: W1001 12:38:09.921761 4913 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.921783 4913 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.921798 4913 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:38:09 crc 
Oct 01 12:38:09 crc kubenswrapper[4913]: W1001 12:38:09.921938 4913 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.921960 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.921957 4913 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.921959 4913 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.922566 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.922604 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.924524 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerStarted","Data":"178417008fb9a2b6b10f6cf48630e682c7f3d30b517b8b42cdbf6edb24e9b9a4"}
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.925389 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Oct 01 12:38:09 crc kubenswrapper[4913]: W1001 12:38:09.925469 4913 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.925493 4913 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.925560 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.927891 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.928973 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t7565" event={"ID":"0cfb3767-c920-41f1-9c7b-88828a9a4ba4","Type":"ContainerStarted","Data":"295fc9e046e7c1c2dcc847ce216d41c171283d01d00129ba38e02773ee3dcf64"}
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.932410 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.935392 4913 scope.go:117] "RemoveContainer" containerID="ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b"
Oct 01 12:38:09 crc kubenswrapper[4913]: E1001 12:38:09.935575 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.963689 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:09 crc kubenswrapper[4913]: I1001 12:38:09.986192 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.005097 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022334 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-env-overrides\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022612 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-kubelet\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022424 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022696 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-etc-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022832 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-netns\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022854 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e8903e6e-381f-4f5c-b9c5-5242c3de2897-rootfs\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 
12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022871 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td6wp\" (UniqueName: \"kubernetes.io/projected/e8903e6e-381f-4f5c-b9c5-5242c3de2897-kube-api-access-td6wp\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022903 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nld\" (UniqueName: \"kubernetes.io/projected/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-kube-api-access-v6nld\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022919 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlshf\" (UniqueName: \"kubernetes.io/projected/0219f135-adcb-41cf-a30c-719dc8b8e8a7-kube-api-access-rlshf\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022934 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-node-log\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022949 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-ovn-kubernetes\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022962 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-config\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022976 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-systemd-units\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.022989 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-systemd\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023004 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-log-socket\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023019 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-system-cni-dir\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023033 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023052 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023088 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-script-lib\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023112 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8903e6e-381f-4f5c-b9c5-5242c3de2897-proxy-tls\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023136 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-bin\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023153 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovn-node-metrics-cert\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023168 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-os-release\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" 
Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023182 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023195 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8903e6e-381f-4f5c-b9c5-5242c3de2897-mcd-auth-proxy-config\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023209 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cnibin\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023222 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023234 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-var-lib-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023250 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cni-binary-copy\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023284 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-ovn\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023300 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-netd\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.023313 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-slash\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.035117 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.053161 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://55ee569c1b045fa148dd3b476f5a562eb858e4d54cc3599564ee9def4d17e4da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:02Z\\\",\\\"message\\\":\\\"W1001 12:37:51.810798 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 12:37:51.811167 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759322271 cert, and key in /tmp/serving-cert-1778987673/serving-signer.crt, /tmp/serving-cert-1778987673/serving-signer.key\\\\nI1001 12:37:52.095398 1 observer_polling.go:159] Starting file observer\\\\nW1001 12:37:52.097151 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 12:37:52.097413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:52.098629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1778987673/tls.crt::/tmp/serving-cert-1778987673/tls.key\\\\\\\"\\\\nF1001 12:38:02.535255 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.067068 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.076833 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.090210 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.103019 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.116171 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123884 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-etc-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123917 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-netns\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123931 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e8903e6e-381f-4f5c-b9c5-5242c3de2897-rootfs\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123946 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td6wp\" (UniqueName: \"kubernetes.io/projected/e8903e6e-381f-4f5c-b9c5-5242c3de2897-kube-api-access-td6wp\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123973 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nld\" (UniqueName: \"kubernetes.io/projected/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-kube-api-access-v6nld\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123996 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlshf\" (UniqueName: \"kubernetes.io/projected/0219f135-adcb-41cf-a30c-719dc8b8e8a7-kube-api-access-rlshf\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124016 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-node-log\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124030 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-ovn-kubernetes\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124044 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-config\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124060 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-systemd-units\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124074 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-systemd\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124070 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-etc-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124112 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-log-socket\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124089 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-log-socket\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124165 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-system-cni-dir\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124184 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124190 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e8903e6e-381f-4f5c-b9c5-5242c3de2897-rootfs\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124204 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124225 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-script-lib\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124226 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-ovn-kubernetes\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124315 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8903e6e-381f-4f5c-b9c5-5242c3de2897-proxy-tls\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124337 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-bin\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.123997 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-netns\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124359 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovn-node-metrics-cert\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124389 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-os-release\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124405 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124430 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8903e6e-381f-4f5c-b9c5-5242c3de2897-mcd-auth-proxy-config\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124447 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cnibin\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124463 
4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124480 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-var-lib-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124495 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cni-binary-copy\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124511 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-ovn\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124526 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-netd\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124540 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-slash\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124562 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-env-overrides\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124577 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-kubelet\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124616 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-kubelet\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124639 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-node-log\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124677 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-os-release\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124770 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-config\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124805 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-systemd-units\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.124828 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-systemd\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125140 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125173 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-ovn\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125197 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-netd\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125208 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cni-binary-copy\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125217 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-slash\") pod 
\"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125280 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-cnibin\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125331 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125354 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-var-lib-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125373 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-openvswitch\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125393 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0219f135-adcb-41cf-a30c-719dc8b8e8a7-system-cni-dir\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125446 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-bin\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125467 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125522 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-env-overrides\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.125950 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-script-lib\") pod \"ovnkube-node-57qvb\" (UID: 
\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.128220 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovn-node-metrics-cert\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.131563 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z"
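Every "Failed to update status for pod" record in this stretch shares one root cause: the kubelet's status patch is routed through the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-10-01T12:38:10Z. A minimal Go sketch to read that certificate's validity window from the node itself (illustrative tooling, not part of this log's stack; it assumes the webhook is still listening on the loopback port shown in the error):

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        // InsecureSkipVerify is deliberate here: the goal is to inspect
        // the certificate the kubelet is rejecting, not to trust it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%v notBefore=%v notAfter=%v\n",
                cert.Subject, cert.NotBefore, cert.NotAfter)
        }
    }

Until that certificate is rotated, every status patch attempted below fails with the identical x509 error.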
Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.142155 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nld\" (UniqueName: \"kubernetes.io/projected/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-kube-api-access-v6nld\") pod \"ovnkube-node-57qvb\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-57qvb"
Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.143339 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.144850 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlshf\" (UniqueName: \"kubernetes.io/projected/0219f135-adcb-41cf-a30c-719dc8b8e8a7-kube-api-access-rlshf\") pod \"multus-additional-cni-plugins-7v2m5\" (UID: \"0219f135-adcb-41cf-a30c-719dc8b8e8a7\") " pod="openshift-multus/multus-additional-cni-plugins-7v2m5"
Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.157624 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.168833 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.189118 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.201820 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.213218 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z"
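The kube-apiserver-check-endpoints container above exited with code 255 (its F1001 fatal line: pods "kube-apiserver-crc" not found) and is now held in CrashLoopBackOff with "back-off 10s". The kubelet doubles a container's restart back-off on each crash up to a cap; the sketch below reproduces that progression under the usual upstream defaults of a 10s base and a 5m cap (assumed here, since the log itself only shows the first step):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Crash-loop back-off: 10s, 20s, 40s, ... capped at 5m.
        const maxDelay = 5 * time.Minute
        delay := 10 * time.Second
        for restart := 1; restart <= 8; restart++ {
            fmt.Printf("restart %d: back-off %v\n", restart, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }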
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.240978 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.243324 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: W1001 12:38:10.251243 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6e6adf1_250b_4f6a_94da_8e3ad2cee3bd.slice/crio-ecf3d54b60b95b036bd08d19c112cecf401544bb6883215bcb0f7322a1c89609 WatchSource:0}: Error finding container ecf3d54b60b95b036bd08d19c112cecf401544bb6883215bcb0f7322a1c89609: Status 404 returned error can't find the container with id ecf3d54b60b95b036bd08d19c112cecf401544bb6883215bcb0f7322a1c89609 Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.251896 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.261610 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: W1001 12:38:10.261715 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0219f135_adcb_41cf_a30c_719dc8b8e8a7.slice/crio-5e2bb90c4581ed77c746da6fc35f37964887f51311a94317c90357ddd537c8de WatchSource:0}: Error finding container 5e2bb90c4581ed77c746da6fc35f37964887f51311a94317c90357ddd537c8de: Status 404 returned error can't find the container with id 5e2bb90c4581ed77c746da6fc35f37964887f51311a94317c90357ddd537c8de Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.288201 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.805912 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:10 crc kubenswrapper[4913]: E1001 12:38:10.806041 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.867678 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.872642 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.878222 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8903e6e-381f-4f5c-b9c5-5242c3de2897-proxy-tls\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.939472 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a" exitCode=0 Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.939544 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.939574 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"ecf3d54b60b95b036bd08d19c112cecf401544bb6883215bcb0f7322a1c89609"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.941016 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.942395 4913 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t7565" event={"ID":"0cfb3767-c920-41f1-9c7b-88828a9a4ba4","Type":"ContainerStarted","Data":"ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.944253 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerStarted","Data":"5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.948186 4913 generic.go:334] "Generic (PLEG): container finished" podID="0219f135-adcb-41cf-a30c-719dc8b8e8a7" containerID="2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd" exitCode=0 Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.948251 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerDied","Data":"2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.948302 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerStarted","Data":"5e2bb90c4581ed77c746da6fc35f37964887f51311a94317c90357ddd537c8de"} Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.959030 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.969826 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.978336 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is 
after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4913]: I1001 12:38:10.988643 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 
12:38:11.000484 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\
" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.011981 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.028780 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.040994 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.052763 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.066842 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.079230 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.092803 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.108017 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.119870 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.125604 4913 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.125696 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8903e6e-381f-4f5c-b9c5-5242c3de2897-mcd-auth-proxy-config podName:e8903e6e-381f-4f5c-b9c5-5242c3de2897 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.625675669 +0000 UTC m=+23.529151257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e8903e6e-381f-4f5c-b9c5-5242c3de2897-mcd-auth-proxy-config") pod "machine-config-daemon-8hltg" (UID: "e8903e6e-381f-4f5c-b9c5-5242c3de2897") : failed to sync configmap cache: timed out waiting for the condition Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.130247 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.141342 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.141591 4913 projected.go:288] Couldn't get configMap openshift-machine-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.141638 4913 projected.go:194] Error preparing data for projected volume kube-api-access-td6wp for pod openshift-machine-config-operator/machine-config-daemon-8hltg: failed to sync configmap cache: timed out waiting for the condition Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.141700 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8903e6e-381f-4f5c-b9c5-5242c3de2897-kube-api-access-td6wp podName:e8903e6e-381f-4f5c-b9c5-5242c3de2897 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.641682572 +0000 UTC m=+23.545158150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-td6wp" (UniqueName: "kubernetes.io/projected/e8903e6e-381f-4f5c-b9c5-5242c3de2897-kube-api-access-td6wp") pod "machine-config-daemon-8hltg" (UID: "e8903e6e-381f-4f5c-b9c5-5242c3de2897") : failed to sync configmap cache: timed out waiting for the condition Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.152540 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.164512 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.184280 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.197172 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.208688 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.218556 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.229017 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.248974 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc 
kubenswrapper[4913]: I1001 12:38:11.252126 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.372753 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.408007 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-z8555"] Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.408418 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.410672 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.410835 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.410938 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.411088 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.424929 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.437717 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.458526 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.473933 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.501950 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.513955 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.531443 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc 
kubenswrapper[4913]: I1001 12:38:11.535080 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23f43f91-4983-4348-926f-4dbcafbbaa18-serviceca\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.535120 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23f43f91-4983-4348-926f-4dbcafbbaa18-host\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.535163 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6lv\" (UniqueName: \"kubernetes.io/projected/23f43f91-4983-4348-926f-4dbcafbbaa18-kube-api-access-7z6lv\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.540746 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.553172 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\
"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.564509 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.574824 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.585799 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.597651 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.608793 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.635567 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8903e6e-381f-4f5c-b9c5-5242c3de2897-mcd-auth-proxy-config\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.635692 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23f43f91-4983-4348-926f-4dbcafbbaa18-host\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.635776 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6lv\" (UniqueName: \"kubernetes.io/projected/23f43f91-4983-4348-926f-4dbcafbbaa18-kube-api-access-7z6lv\") pod \"node-ca-z8555\" (UID: 
\"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.635891 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23f43f91-4983-4348-926f-4dbcafbbaa18-serviceca\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.635806 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23f43f91-4983-4348-926f-4dbcafbbaa18-host\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.636202 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8903e6e-381f-4f5c-b9c5-5242c3de2897-mcd-auth-proxy-config\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.636803 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/23f43f91-4983-4348-926f-4dbcafbbaa18-serviceca\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.650816 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6lv\" (UniqueName: \"kubernetes.io/projected/23f43f91-4983-4348-926f-4dbcafbbaa18-kube-api-access-7z6lv\") pod \"node-ca-z8555\" (UID: \"23f43f91-4983-4348-926f-4dbcafbbaa18\") " pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.723926 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-z8555" Oct 01 12:38:11 crc kubenswrapper[4913]: W1001 12:38:11.734206 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f43f91_4983_4348_926f_4dbcafbbaa18.slice/crio-98c4eb1dc1a3c3b26f6aa5f9333b9384cd9ed9dcbb6b1141c140e6f7c7e23016 WatchSource:0}: Error finding container 98c4eb1dc1a3c3b26f6aa5f9333b9384cd9ed9dcbb6b1141c140e6f7c7e23016: Status 404 returned error can't find the container with id 98c4eb1dc1a3c3b26f6aa5f9333b9384cd9ed9dcbb6b1141c140e6f7c7e23016 Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.736367 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.736552 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736580 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:15.73654947 +0000 UTC m=+27.640025068 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.736615 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td6wp\" (UniqueName: \"kubernetes.io/projected/e8903e6e-381f-4f5c-b9c5-5242c3de2897-kube-api-access-td6wp\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.736652 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.736704 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.736743 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736810 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736829 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736839 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736873 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736892 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:11 crc kubenswrapper[4913]: 
E1001 12:38:11.736880 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:15.736865599 +0000 UTC m=+27.640341177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736948 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:15.73692907 +0000 UTC m=+27.640404708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.736982 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:15.736961001 +0000 UTC m=+27.640436669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.737073 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.737147 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.737159 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.737201 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:15.737188318 +0000 UTC m=+27.640664006 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.739480 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td6wp\" (UniqueName: \"kubernetes.io/projected/e8903e6e-381f-4f5c-b9c5-5242c3de2897-kube-api-access-td6wp\") pod \"machine-config-daemon-8hltg\" (UID: \"e8903e6e-381f-4f5c-b9c5-5242c3de2897\") " pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.753046 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.753886 4913 scope.go:117] "RemoveContainer" containerID="ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.754066 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.758809 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:38:11 crc kubenswrapper[4913]: W1001 12:38:11.770327 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8903e6e_381f_4f5c_b9c5_5242c3de2897.slice/crio-4a162fc8ca6e89bec727f9b5fce07ea04ae3bd796535f48bd562ce81b2efcdb2 WatchSource:0}: Error finding container 4a162fc8ca6e89bec727f9b5fce07ea04ae3bd796535f48bd562ce81b2efcdb2: Status 404 returned error can't find the container with id 4a162fc8ca6e89bec727f9b5fce07ea04ae3bd796535f48bd562ce81b2efcdb2 Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.806434 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.806566 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.806522 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:11 crc kubenswrapper[4913]: E1001 12:38:11.807086 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.956997 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.957040 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.957051 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.957059 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.957069 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.957078 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.960185 4913 generic.go:334] "Generic (PLEG): container finished" podID="0219f135-adcb-41cf-a30c-719dc8b8e8a7" containerID="b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c" exitCode=0 Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.960473 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerDied","Data":"b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.961853 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.961905 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"4a162fc8ca6e89bec727f9b5fce07ea04ae3bd796535f48bd562ce81b2efcdb2"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.966655 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z8555" event={"ID":"23f43f91-4983-4348-926f-4dbcafbbaa18","Type":"ContainerStarted","Data":"98c4eb1dc1a3c3b26f6aa5f9333b9384cd9ed9dcbb6b1141c140e6f7c7e23016"} Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.977723 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.988477 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:11 crc kubenswrapper[4913]: I1001 12:38:11.998034 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.009651 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.023906 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.042192 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.054289 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.065924 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.078200 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.089645 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.099836 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.112290 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.121198 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.809892 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:12 crc kubenswrapper[4913]: E1001 12:38:12.810375 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.971176 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z8555" event={"ID":"23f43f91-4983-4348-926f-4dbcafbbaa18","Type":"ContainerStarted","Data":"3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec"} Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.974296 4913 generic.go:334] "Generic (PLEG): container finished" podID="0219f135-adcb-41cf-a30c-719dc8b8e8a7" containerID="e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf" exitCode=0 Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.974366 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerDied","Data":"e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf"} Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.982677 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e"} Oct 01 12:38:12 crc kubenswrapper[4913]: I1001 12:38:12.994840 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.009797 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.030732 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.041597 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 
2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.052661 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.069851 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.080316 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.095117 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.107238 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.119384 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.131092 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.142617 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.153329 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.170471 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.189573 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.201159 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.212288 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.224015 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.236731 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.245451 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 
2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.256906 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.264909 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.274443 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.287190 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.300098 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.311370 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.352083 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.357046 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.360200 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.362978 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.375194 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.387866 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.399153 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.411679 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.425701 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.438278 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.448805 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.459076 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.472178 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.484077 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.495315 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.513503 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.523788 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.556938 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.603982 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.641019 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.677211 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.716007 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.760697 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.796743 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.806392 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:13 crc kubenswrapper[4913]: E1001 12:38:13.806487 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.806400 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:13 crc kubenswrapper[4913]: E1001 12:38:13.806553 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.842132 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.878437 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.918224 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.957753 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.989762 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.991964 4913 generic.go:334] "Generic (PLEG): container finished" podID="0219f135-adcb-41cf-a30c-719dc8b8e8a7" containerID="c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a" exitCode=0 Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.992132 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerDied","Data":"c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a"} Oct 01 12:38:13 crc kubenswrapper[4913]: I1001 12:38:13.999775 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.041890 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.081649 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.123956 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.162818 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.196426 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.237159 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.279059 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.315465 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.361182 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"
image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.383414 4913 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.385557 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.385589 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.385599 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.385690 4913 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.398709 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.458785 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.471003 4913 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.471230 4913 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.472412 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.472446 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.472455 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.472469 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.472479 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.489433 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.491999 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.492040 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.492053 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.492072 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.492084 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.503819 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.506638 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.506670 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.506681 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.506698 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.506711 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.515984 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.520395 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.525559 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.525584 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.525592 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.525607 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.525616 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.540443 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.544519 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.544555 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.544563 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.544578 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.544588 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.556189 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.556344 4913 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.557529 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.558128 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.558155 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: 
I1001 12:38:14.558164 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.558178 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.558188 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.596608 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.634936 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.660655 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.660701 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.660715 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.660741 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.660755 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.763297 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.763326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.763335 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.763348 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.763359 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.806255 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:14 crc kubenswrapper[4913]: E1001 12:38:14.806433 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.866710 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.866751 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.866761 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.866779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.866790 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.969374 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.969452 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.969474 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.969505 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.969527 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.999131 4913 generic.go:334] "Generic (PLEG): container finished" podID="0219f135-adcb-41cf-a30c-719dc8b8e8a7" containerID="8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb" exitCode=0 Oct 01 12:38:14 crc kubenswrapper[4913]: I1001 12:38:14.999172 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerDied","Data":"8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.013048 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.023392 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.035870 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.048086 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.060615 
4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:
09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.073339 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.077612 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.077640 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.077649 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.077663 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.077673 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.096671 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f78
5cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.115176 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.127937 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.140820 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.153691 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.171885 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.179618 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.179653 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.179670 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.179684 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.179693 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.183010 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.198488 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:15Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.282731 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.282767 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.282775 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.282791 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.282804 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.385343 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.385377 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.385385 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.385398 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.385408 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.488246 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.488336 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.488347 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.488363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.488375 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.590590 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.590654 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.590671 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.590694 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.590712 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.693072 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.693112 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.693127 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.693143 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.693153 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.775634 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.775821 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:23.775783064 +0000 UTC m=+35.679258642 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.775790 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.775874 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.775900 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.775917 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.775926 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.775945 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.775957 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.775977 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776005 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2025-10-01 12:38:23.775984219 +0000 UTC m=+35.679459797 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776023 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:23.77601558 +0000 UTC m=+35.679491158 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776086 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776123 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776130 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776225 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:23.776205075 +0000 UTC m=+35.679680653 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776233 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.776345 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:23.776317698 +0000 UTC m=+35.679793326 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.795888 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.795934 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.795946 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.795964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.795977 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.805772 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.805828 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.805918 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:15 crc kubenswrapper[4913]: E1001 12:38:15.805967 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.898315 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.898354 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.898365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.898380 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4913]: I1001 12:38:15.898391 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.000815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.000878 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.000896 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.000927 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.000947 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.035947 4913 generic.go:334] "Generic (PLEG): container finished" podID="0219f135-adcb-41cf-a30c-719dc8b8e8a7" containerID="cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971" exitCode=0 Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.035996 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerDied","Data":"cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.056568 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.073513 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.085555 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.097643 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.103158 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.103213 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.103233 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.103252 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.103280 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.114375 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.135764 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.149287 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.163919 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.176594 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.191815 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.203750 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.205590 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.205647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.205659 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.205679 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.205692 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.220379 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.236444 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.248232 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:16Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.308397 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.308428 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.308438 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.308454 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.308462 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.411669 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.411711 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.411721 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.411737 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.411748 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.514878 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.514903 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.514913 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.514926 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.514936 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.618070 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.618361 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.618370 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.618382 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.618391 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.720815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.720876 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.720897 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.720921 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.720938 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.806036 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:16 crc kubenswrapper[4913]: E1001 12:38:16.806200 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.823770 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.823804 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.823812 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.823824 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.823834 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.927259 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.927367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.927390 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.927421 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4913]: I1001 12:38:16.927445 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.029734 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.029800 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.029817 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.029843 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.029860 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.046445 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.046541 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.052973 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" event={"ID":"0219f135-adcb-41cf-a30c-719dc8b8e8a7","Type":"ContainerStarted","Data":"a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.062011 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.077738 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.088646 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e36
92a733f3a02de48feb46bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.107574 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.127900 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.133190 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.133293 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.133314 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.133344 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.133401 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.144852 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.161662 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.181511 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.195514 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.216094 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.233823 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.236642 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.236668 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.236679 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.236696 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.236708 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.252087 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.266763 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.281113 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.295915 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.312049 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.329582 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.339334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.339367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.339377 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.339392 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.339403 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.349928 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.359879 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.376873 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.388783 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.404879 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.418750 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.430121 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 
2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.441907 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.441951 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.441964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.441983 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.441999 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.445597 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.457993 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.476174 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.487909 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.508818 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e36
92a733f3a02de48feb46bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.544838 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.544943 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.544967 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.545010 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.545037 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.647474 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.647516 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.647527 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.647543 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.647555 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.750542 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.750614 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.750631 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.750654 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.750673 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.806253 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.806332 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:17 crc kubenswrapper[4913]: E1001 12:38:17.806453 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:17 crc kubenswrapper[4913]: E1001 12:38:17.806550 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.854073 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.854118 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.854129 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.854146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.854160 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.956287 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.956320 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.956328 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.956341 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4913]: I1001 12:38:17.956350 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.056810 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.057341 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.058104 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.058171 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.058185 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.058202 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.058237 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.123014 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.135392 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.151289 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.161022 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.161077 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.161107 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.161133 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.161153 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.165660 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.180372 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.195954 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.210664 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.233859 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e36
92a733f3a02de48feb46bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.247869 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.258926 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.263315 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.263349 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.263358 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.263373 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.263382 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.271750 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.282368 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.298243 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.313208 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.329558 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.366180 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.366217 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.366227 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.366243 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.366253 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.468423 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.468466 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.468482 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.468501 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.468515 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.570796 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.571288 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.571393 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.571493 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.571576 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.674582 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.674636 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.674654 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.674678 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.674695 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.777329 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.777397 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.777427 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.777462 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.777488 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.806062 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:18 crc kubenswrapper[4913]: E1001 12:38:18.806330 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.831701 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.848406 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.861744 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.880326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.880356 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.880367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.880382 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.880393 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.885722 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.904091 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.915235 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.931931 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e36
92a733f3a02de48feb46bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.943000 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.957405 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.970700 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.982416 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.983666 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.983771 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.983830 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.983936 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.984005 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4913]: I1001 12:38:18.997071 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T12:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.013524 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.026864 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.062415 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/0.log" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.065569 4913 generic.go:334] "Generic (PLEG): container finished" 
podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a" exitCode=1 Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.065649 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.066804 4913 scope.go:117] "RemoveContainer" containerID="8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.079295 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.086489 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.086524 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.086535 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.086555 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.086565 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.098289 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\".008680 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.008688 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.008707 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:38:19.008870 6208 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.009202 6208 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:38:19.009249 6208 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.009317 6208 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:38:19.010345 6208 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:38:19.010418 6208 factory.go:656] Stopping watch factory\\\\nI1001 12:38:19.010449 6208 ovnkube.go:599] Stopped ovnkube\\\\nI1001 
12:38:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.111015 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.121770 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.133528 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\"
:\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.143638 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.155441 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.167446 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.178625 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.188301 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.188331 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.188341 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.188356 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.188365 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.192455 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.204752 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.215439 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.226137 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.239476 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.290524 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.290570 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.290582 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.290599 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.290612 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.393114 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.393165 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.393183 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.393204 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.393225 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.495693 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.495727 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.495736 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.495749 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.495761 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.597800 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.597835 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.597844 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.597859 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.597875 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.699743 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.699804 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.699817 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.699835 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.699848 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.801544 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.801580 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.801588 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.801603 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.801616 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.805777 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.805790 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:19 crc kubenswrapper[4913]: E1001 12:38:19.806000 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:19 crc kubenswrapper[4913]: E1001 12:38:19.805909 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.903642 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.903688 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.903701 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.903719 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4913]: I1001 12:38:19.903864 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.006604 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.006645 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.006655 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.006671 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.006681 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.071319 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/1.log" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.073547 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/0.log" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.076365 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7" exitCode=1 Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.076420 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.076465 4913 scope.go:117] "RemoveContainer" containerID="8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.077037 4913 scope.go:117] "RemoveContainer" containerID="cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7" Oct 01 12:38:20 crc kubenswrapper[4913]: E1001 12:38:20.077199 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.091495 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.106465 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.108954 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.109012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.109025 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.109041 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.109051 4913 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.127335 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8406589a2c09c53554c772174d3c97c4b6499e3692a733f3a02de48feb46bf6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\".008680 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.008688 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.008707 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:38:19.008870 6208 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.009202 6208 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:38:19.009249 6208 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:38:19.009317 6208 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:38:19.010345 6208 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 12:38:19.010418 6208 factory.go:656] Stopping watch factory\\\\nI1001 12:38:19.010449 6208 ovnkube.go:599] Stopped ovnkube\\\\nI1001 
12:38:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.138511 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.149547 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.163403 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.173086 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.186195 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.197992 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.211041 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.211076 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.211087 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.211106 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.211120 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.211754 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.223361 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.235252 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.249908 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.262149 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.313791 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.313848 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.313869 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.313895 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.313916 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.416884 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.417309 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.417468 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.417674 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.417818 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.521110 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.521154 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.521167 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.521184 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.521196 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.624005 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.624037 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.624048 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.624063 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.624075 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.726707 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.726770 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.726792 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.726820 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.726840 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.805994 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:20 crc kubenswrapper[4913]: E1001 12:38:20.806455 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.829192 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.829250 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.829304 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.829335 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.829356 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.932537 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.932602 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.932621 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.932644 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4913]: I1001 12:38:20.932664 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.034813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.034862 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.034873 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.034893 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.034906 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.083601 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/1.log" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.096602 4913 scope.go:117] "RemoveContainer" containerID="cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7" Oct 01 12:38:21 crc kubenswrapper[4913]: E1001 12:38:21.097032 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.110214 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.128130 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.137632 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.137833 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.137920 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.137986 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.138062 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.144176 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.159650 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.174333 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.192598 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.212109 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.231668 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.241730 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.241768 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.241779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.241796 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.241807 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.263696 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.287474 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.304619 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.316202 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.328122 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.344155 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.344183 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.344190 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.344203 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.344211 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.354181 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.447012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.447307 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.447573 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.447812 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.448040 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.550575 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.550642 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.550663 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.550688 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.550708 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.653669 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.653717 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.653730 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.653750 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.653762 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.755464 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.755525 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.755548 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.755575 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.755592 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.806171 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.806242 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:21 crc kubenswrapper[4913]: E1001 12:38:21.806354 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:21 crc kubenswrapper[4913]: E1001 12:38:21.806448 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.819567 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj"] Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.819949 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.821740 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.821948 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.838227 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.857956 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.857990 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.858000 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.858030 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.858043 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.859603 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.874903 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.887769 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.905331 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.918765 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.929681 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.932573 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgd7x\" (UniqueName: \"kubernetes.io/projected/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-kube-api-access-fgd7x\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.932624 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.932644 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.932772 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.942111 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.958798 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.960374 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.960521 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.960590 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.960664 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.960747 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.974551 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.984388 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4913]: I1001 12:38:21.996314 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.006019 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.020676 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.033638 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.033682 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgd7x\" (UniqueName: \"kubernetes.io/projected/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-kube-api-access-fgd7x\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.033741 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.033767 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.034786 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.035065 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.038630 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\"
,\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.042004 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.051221 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgd7x\" (UniqueName: \"kubernetes.io/projected/687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb-kube-api-access-fgd7x\") pod \"ovnkube-control-plane-749d76644c-zbmtj\" (UID: \"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.062868 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.062926 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.062944 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.062970 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 
12:38:22.062988 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.136655 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" Oct 01 12:38:22 crc kubenswrapper[4913]: W1001 12:38:22.154503 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687c12dc_5e8c_41f7_a962_c0b2cdd2a3cb.slice/crio-dfb4712c6aa0d350ebf7c74f05cee9febf57e6b3b13833dda6bcfbfecc542f93 WatchSource:0}: Error finding container dfb4712c6aa0d350ebf7c74f05cee9febf57e6b3b13833dda6bcfbfecc542f93: Status 404 returned error can't find the container with id dfb4712c6aa0d350ebf7c74f05cee9febf57e6b3b13833dda6bcfbfecc542f93 Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.165287 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.165330 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.165342 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.165358 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.165372 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.268682 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.268738 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.268754 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.268781 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.268800 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.371838 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.371885 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.371896 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.371915 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.371928 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.474012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.474061 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.474076 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.474098 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.474113 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.575945 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.576005 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.576023 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.576048 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.576067 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.679291 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.679352 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.679364 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.679383 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.679399 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.781851 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.781917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.781936 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.781963 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.781981 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.806197 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:22 crc kubenswrapper[4913]: E1001 12:38:22.806385 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.884895 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.884965 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.884982 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.885023 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.885040 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.987580 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.987621 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.987630 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.987647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4913]: I1001 12:38:22.987657 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.090463 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.090553 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.090578 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.090608 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.090632 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.103773 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" event={"ID":"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb","Type":"ContainerStarted","Data":"bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.103853 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" event={"ID":"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb","Type":"ContainerStarted","Data":"702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.103876 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" event={"ID":"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb","Type":"ContainerStarted","Data":"dfb4712c6aa0d350ebf7c74f05cee9febf57e6b3b13833dda6bcfbfecc542f93"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.125362 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.142796 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.182649 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.194242 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.194325 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.194340 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.194366 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.194381 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.199301 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.221749 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.244729 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.260041 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.275786 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.290888 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.297172 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.297322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.297407 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.297489 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.297586 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.309286 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.321600 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.342347 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.362698 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.377428 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.395452 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.400179 4913 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.400349 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.400470 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.400596 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.400761 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.503398 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.503647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.503740 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.503822 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.503904 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.606428 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.606472 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.606489 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.606508 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.606519 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.709155 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.709244 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.709295 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.709325 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.709348 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.718389 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8c8wp"] Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.719048 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.719181 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.732377 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.751171 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad95569
7ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.766572 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.787221 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.805650 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.805735 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.805779 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.806450 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.806305 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.811761 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.811829 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.811853 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.811882 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.811900 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.821871 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.838705 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.853446 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.853607 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:39.853585351 +0000 UTC m=+51.757060929 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.853905 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.853955 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfdg\" (UniqueName: \"kubernetes.io/projected/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-kube-api-access-5kfdg\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.853987 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.854011 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.854035 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.854068 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854126 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854159 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 
12:38:23.854172 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854199 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854223 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854262 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:39.854244539 +0000 UTC m=+51.757720147 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854308 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854335 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:39.854314851 +0000 UTC m=+51.757790469 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854175 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854356 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854362 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 12:38:39.854348222 +0000 UTC m=+51.757823830 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.854386 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:39.854375553 +0000 UTC m=+51.757851141 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.857443 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.882090 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.894997 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.910874 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.914938 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 
crc kubenswrapper[4913]: I1001 12:38:23.915091 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.915206 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.915403 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.915543 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.928858 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.941365 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.954630 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.955011 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfdg\" (UniqueName: \"kubernetes.io/projected/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-kube-api-access-5kfdg\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.955096 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.955220 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: E1001 12:38:23.955354 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:24.455325012 +0000 UTC m=+36.358800630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.967707 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.982216 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:23 crc kubenswrapper[4913]: I1001 12:38:23.985519 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfdg\" (UniqueName: \"kubernetes.io/projected/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-kube-api-access-5kfdg\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.017988 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.018017 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.018025 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.018037 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.018047 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.122218 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.122320 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.122365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.122399 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.122422 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.224510 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.224550 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.224558 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.224572 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.224580 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.326676 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.326710 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.326720 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.326736 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.326746 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.428810 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.428835 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.428843 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.428858 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.428866 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.460493 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.460602 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.460707 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:25.460631397 +0000 UTC m=+37.364106975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.531396 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.531420 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.531429 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.531441 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.531450 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.634292 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.634336 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.634345 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.634359 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.634414 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.645233 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.645524 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.645655 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.645809 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.645933 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.663231 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:24Z is after 
2025-08-24T17:21:41Z"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.667435 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.667516 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.667530 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.667545 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.667555 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.682557 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ [duplicate status payload elided; byte-for-byte identical to the attempt above] }\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:24Z is after 2025-08-24T17:21:41Z"
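Every retry in this section fails for the same reason: the serving certificate of the "node.network-node-identity.openshift.io" webhook at 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, weeks before the 2025-10-01 boot captured here. A minimal sketch for confirming the validity window from the node itself, assuming the third-party "cryptography" package is installed (ssl.get_server_certificate performs no verification, so it still returns an expired certificate):

    # check_webhook_cert.py -- hypothetical helper, not part of OpenShift.
    # Host and port are taken from the webhook URL in the error above.
    import ssl
    from cryptography import x509  # assumption: pip install cryptography

    # Fetch the serving certificate without verifying it, then parse its dates.
    pem = ssl.get_server_certificate(("127.0.0.1", 9743))
    cert = x509.load_pem_x509_certificate(pem.encode())
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # the log reports 2025-08-24T17:21:41Z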
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.687367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.687410 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.687418 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.687433 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.687443 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.700479 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ [duplicate status payload elided; byte-for-byte identical to the attempts above] }\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:24Z is after 2025-08-24T17:21:41Z"
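Independently of the webhook failure, every "Node became not ready" condition in this section carries the same message: no CNI configuration file in /etc/kubernetes/cni/net.d/. A stdlib-only sketch (hypothetical helper, run on the node; the path is taken from the log message) to check whether the network plugin has written its config yet:

    # check_cni_conf.py -- hypothetical helper for the CNI condition above.
    import pathlib

    cni_dir = pathlib.Path("/etc/kubernetes/cni/net.d")
    # libcni loads .conf, .conflist, and .json files from this directory.
    configs = ([p for p in sorted(cni_dir.iterdir())
                if p.suffix in (".conf", ".conflist", ".json")]
               if cni_dir.is_dir() else [])
    if not configs:
        print(f"no CNI config under {cni_dir}; the network plugin has not started")
    else:
        for p in configs:
            print(p, p.stat().st_size, "bytes")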
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.703913 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.703946 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.703985 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.704004 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.704015 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.719649 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ [duplicate status payload elided; byte-for-byte identical to the attempts above] }\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:24Z is after 2025-08-24T17:21:41Z"
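The attempt above is the fourth of five visible in this pass: the kubelet retries the status patch a fixed number of times (nodeStatusUpdateRetry, 5 in the kubelet source) and then gives up with "Unable to update node status", which appears further below. A sketch of that control flow, with try_update as a stand-in for the real patch call:

    # Retry pattern mirrored from the kubelet's updateNodeStatus loop (sketch).
    NODE_STATUS_UPDATE_RETRY = 5  # assumption: kubelet's nodeStatusUpdateRetry constant

    def update_node_status(try_update) -> None:
        for attempt in range(NODE_STATUS_UPDATE_RETRY):
            err = try_update(attempt)
            if err is None:
                return  # patched successfully
            print(f'E "Error updating node status, will retry" err="{err}"')
        print('E "Unable to update node status" err="update node status exceeds retry count"')

    # Every attempt here fails identically, as in the five entries in this log.
    update_node_status(lambda i: "failed calling webhook: certificate has expired")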
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.722564 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.722598 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.722609 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.722623 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.722635 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.734807 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ [duplicate status payload elided; byte-for-byte identical to the attempts above] }\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:24Z is after 2025-08-24T17:21:41Z"
2025-08-24T17:21:41Z" Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.734960 4913 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.736260 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.736341 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.736351 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.736366 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.736375 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.805836 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:24 crc kubenswrapper[4913]: E1001 12:38:24.806002 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.838161 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.838203 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.838215 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.838232 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.838245 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.940002 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.940037 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.940049 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.940067 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4913]: I1001 12:38:24.940078 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.042885 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.042945 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.042959 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.042976 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.042988 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.146975 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.147031 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.147047 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.147070 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.147086 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.250480 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.250539 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.250557 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.250582 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.250599 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.354122 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.354465 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.354564 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.354683 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.354798 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.458409 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.458527 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.458555 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.458588 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.458615 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.474063 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:25 crc kubenswrapper[4913]: E1001 12:38:25.474255 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:25 crc kubenswrapper[4913]: E1001 12:38:25.474371 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:27.474348994 +0000 UTC m=+39.377824602 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.561879 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.561945 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.561963 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.561989 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.562007 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.665785 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.665853 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.665869 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.665889 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.665903 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.767853 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.767887 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.767897 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.767915 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.767928 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.806433 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.806559 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.806661 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:25 crc kubenswrapper[4913]: E1001 12:38:25.806714 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
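The recurring NetworkReady=false / NodeNotReady records above all trace back to one probe: the kubelet asks the container runtime for network status, and the runtime reports that there is no CNI configuration file under /etc/kubernetes/cni/net.d/. As a rough sketch of that kind of presence check (illustrative Go, not the kubelet's or CRI-O's actual code; only the directory path is taken from the log):

// cnicheck.go: minimal sketch of a CNI-config presence probe, assuming the
// directory shown in the log records above. Illustrative only.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one file with an
// extension that CNI config loaders commonly accept.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		fmt.Println("network plugin not ready: no CNI configuration file found")
		return
	}
	fmt.Println("CNI configuration present")
}

Until a network provider (here, OVN-Kubernetes via the ovnkube-node pod seen later in the log) writes a config into that directory, pod sandbox creation is skipped with the "network is not ready" errors recorded above.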
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:25 crc kubenswrapper[4913]: E1001 12:38:25.806849 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:25 crc kubenswrapper[4913]: E1001 12:38:25.806980 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.870877 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.870922 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.870932 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.870952 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.870965 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.973994 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.974041 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.974053 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.974072 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4913]: I1001 12:38:25.974084 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.077218 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.077303 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.077319 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.077340 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.077355 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.179064 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.179114 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.179125 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.179144 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.179156 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.282173 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.282217 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.282228 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.282245 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.282257 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.384931 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.384974 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.384991 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.385062 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.385081 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.487218 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.487261 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.487299 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.487322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.487336 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.589580 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.589647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.589670 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.589699 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.589723 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.693337 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.693400 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.693422 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.693452 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.693473 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.795912 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.795984 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.796006 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.796034 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.796054 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.806504 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:26 crc kubenswrapper[4913]: E1001 12:38:26.806832 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
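The MountVolume failure a few records back ("No retries permitted until 2025-10-01 12:38:27 ... durationBeforeRetry 2s") shows the volume manager's error-driven backoff: each failed attempt schedules the earliest permitted retry further out rather than looping tightly. A generic sketch of that pattern follows; the type, method names, and cap are hypothetical, not the kubelet's actual nestedpendingoperations implementation:

// backoff.go: sketch of an error-driven retry backoff like the one the
// "durationBeforeRetry 2s" record above implies. Parameters hypothetical.
package main

import (
	"fmt"
	"time"
)

type volumeBackoff struct {
	delay     time.Duration // current durationBeforeRetry
	max       time.Duration // cap on the delay
	nextRetry time.Time     // earliest permitted retry
}

// recordFailure grows the delay (up to max) and pushes out nextRetry.
func (b *volumeBackoff) recordFailure(now time.Time) {
	if b.delay == 0 {
		b.delay = 2 * time.Second // first observed retry in the log was 2s
	} else {
		b.delay *= 2
		if b.delay > b.max {
			b.delay = b.max
		}
	}
	b.nextRetry = now.Add(b.delay)
}

// retryPermitted reports whether an attempt is allowed at time now.
func (b *volumeBackoff) retryPermitted(now time.Time) bool {
	return !now.Before(b.nextRetry)
}

func main() {
	b := &volumeBackoff{max: 2 * time.Minute}
	now := time.Now()
	for i := 1; i <= 4; i++ {
		b.recordFailure(now)
		fmt.Printf("failure %d: no retries permitted until %s (durationBeforeRetry %s)\n",
			i, b.nextRetry.UTC().Format(time.RFC3339), b.delay)
		now = b.nextRetry
		fmt.Println("retry permitted now:", b.retryPermitted(now))
	}
}

In the log, the mount keeps failing because the secret openshift-multus/metrics-daemon-secret is "not registered" with the kubelet yet, so each attempt is rescheduled instead of retried immediately.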
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.806979 4913 scope.go:117] "RemoveContainer" containerID="ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.898751 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.898801 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.898816 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.898838 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4913]: I1001 12:38:26.898854 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.001486 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.001925 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.002085 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.002110 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.002411 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.104917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.104960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.104971 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.104987 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.104997 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.118873 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.120521 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.120890 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.136573 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.150574 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.162246 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.172774 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.183956 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.199144 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.207379 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.207532 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.207624 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.207736 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.207824 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.212497 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.227940 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad95569
7ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.241035 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.252778 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.265756 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.277589 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.299765 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.309947 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.310125 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.310244 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.310363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.310452 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.312712 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.325708 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.335350 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.412569 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.412612 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.412623 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.412640 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.412650 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.494246 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:27 crc kubenswrapper[4913]: E1001 12:38:27.494509 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:27 crc kubenswrapper[4913]: E1001 12:38:27.494781 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:31.494763113 +0000 UTC m=+43.398238691 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.514972 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.515014 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.515026 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.515045 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.515055 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.617243 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.617511 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.617663 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.617880 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.618258 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.720327 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.720594 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.720802 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.720994 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.721175 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.806119 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:27 crc kubenswrapper[4913]: E1001 12:38:27.806240 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.806134 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.806355 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:27 crc kubenswrapper[4913]: E1001 12:38:27.806378 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:27 crc kubenswrapper[4913]: E1001 12:38:27.806542 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.823018 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.823249 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.823329 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.823466 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.823555 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.925807 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.925999 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.926054 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.926122 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4913]: I1001 12:38:27.926206 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.001488 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.002194 4913 scope.go:117] "RemoveContainer" containerID="cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7" Oct 01 12:38:28 crc kubenswrapper[4913]: E1001 12:38:28.002364 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.028551 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.028586 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.028597 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.028612 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.028624 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.130631 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.130951 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.131092 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.131289 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.131422 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.234847 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.235073 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.235134 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.235195 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.235284 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.338154 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.338213 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.338230 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.338259 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.338316 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.441335 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.441387 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.441404 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.441427 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.441443 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.544328 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.544400 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.544422 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.544450 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.544470 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.647309 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.647357 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.647373 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.647393 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.647411 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.750692 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.750960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.751101 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.751243 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.751385 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.805746 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:28 crc kubenswrapper[4913]: E1001 12:38:28.806324 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.824770 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.851619 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad95569
7ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.854381 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.854483 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.854498 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.854515 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.854527 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.865711 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.883339 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.904312 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.918506 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.934761 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.948702 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.957262 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.957365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.957383 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.957408 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.957426 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.965843 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.983649 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:28 crc kubenswrapper[4913]: I1001 12:38:28.997297 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.012519 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.026164 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.039880 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.053406 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.060300 4913 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.060363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.060375 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.060394 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.060406 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.070260 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.162494 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.162560 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.162569 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.162583 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.162591 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.268967 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.269611 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.269647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.269666 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.269679 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.371939 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.372004 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.372022 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.372047 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.372064 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.474231 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.474336 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.474351 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.474370 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.474382 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.576224 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.576434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.576493 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.576556 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.576615 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.678875 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.678936 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.678951 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.678975 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.678992 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.781250 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.781327 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.781340 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.781356 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.781367 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.806130 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.806161 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.806257 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:29 crc kubenswrapper[4913]: E1001 12:38:29.806388 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:29 crc kubenswrapper[4913]: E1001 12:38:29.806506 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:29 crc kubenswrapper[4913]: E1001 12:38:29.806624 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.883251 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.883321 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.883333 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.883349 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.883359 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.985972 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.986016 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.986026 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.986040 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4913]: I1001 12:38:29.986051 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.089031 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.089089 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.089105 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.089129 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.089147 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.191305 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.191341 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.191352 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.191371 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.191382 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.294991 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.295030 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.295041 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.295057 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.295068 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.397729 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.397790 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.397813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.397840 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.397859 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.500364 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.500403 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.500411 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.500425 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.500435 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.602764 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.602803 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.602815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.602832 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.602845 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.707465 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.707511 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.707521 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.707538 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.707549 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.806495 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:30 crc kubenswrapper[4913]: E1001 12:38:30.806611 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.810242 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.810306 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.810315 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.810327 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.810337 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.912510 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.912738 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.912811 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.912924 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4913]: I1001 12:38:30.913009 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.015121 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.015153 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.015160 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.015175 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.015184 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.117812 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.118062 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.118125 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.118186 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.118247 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.220841 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.220869 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.220880 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.220895 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.220906 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.324028 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.324101 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.324123 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.324146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.324164 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.432885 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.433539 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.433566 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.433595 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.433614 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.537510 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.537562 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.537579 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.537606 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.537627 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.537997 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:31 crc kubenswrapper[4913]: E1001 12:38:31.538333 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:31 crc kubenswrapper[4913]: E1001 12:38:31.538434 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:39.53840837 +0000 UTC m=+51.441883988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.639824 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.639946 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.639965 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.639993 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.640029 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.742542 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.742601 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.742617 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.742640 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.742657 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.806464 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.806548 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.806491 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:31 crc kubenswrapper[4913]: E1001 12:38:31.806684 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:31 crc kubenswrapper[4913]: E1001 12:38:31.806797 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:31 crc kubenswrapper[4913]: E1001 12:38:31.806914 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.845215 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.845291 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.845308 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.845326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.845337 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.948652 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.948688 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.948695 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.948708 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4913]: I1001 12:38:31.948716 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.050838 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.050892 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.050904 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.050923 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.050937 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.153095 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.153151 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.153186 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.153211 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.153231 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.256214 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.256311 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.256329 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.256352 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.256369 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.358982 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.359042 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.359060 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.359084 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.359102 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.462164 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.462218 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.462234 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.462257 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.462302 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.564841 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.564879 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.564891 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.564906 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.564918 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.667922 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.667969 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.667982 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.668001 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.668014 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.770603 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.770647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.770661 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.770679 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.770691 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.805862 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:32 crc kubenswrapper[4913]: E1001 12:38:32.806048 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.874460 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.874537 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.874562 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.874593 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.874615 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.977884 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.977944 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.977965 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.977995 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:32 crc kubenswrapper[4913]: I1001 12:38:32.978021 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:32Z","lastTransitionTime":"2025-10-01T12:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.081474 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.081589 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.081608 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.081632 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.081649 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.184960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.185033 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.185066 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.185096 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.185116 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.287742 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.287794 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.287809 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.287827 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.287859 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.391420 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.391497 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.391516 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.391541 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.391558 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.493816 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.493875 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.493895 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.493918 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.493937 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.597176 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.597227 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.597243 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.597282 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.597299 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.705323 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.705911 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.706096 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.706260 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.706518 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.805752 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.805752 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:33 crc kubenswrapper[4913]: E1001 12:38:33.805968 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:33 crc kubenswrapper[4913]: E1001 12:38:33.806152 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.805790 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:33 crc kubenswrapper[4913]: E1001 12:38:33.806748 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.809365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.809553 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.809680 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.809805 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.809978 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.912734 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.912794 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.912815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.912841 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:33 crc kubenswrapper[4913]: I1001 12:38:33.912860 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:33Z","lastTransitionTime":"2025-10-01T12:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.015593 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.015624 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.015631 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.015667 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.015677 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.118365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.118414 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.118428 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.118450 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.118466 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.220371 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.220430 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.220448 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.220476 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.220492 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.323637 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.324075 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.324343 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.324552 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.324688 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.426913 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.427366 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.427501 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.427640 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.427800 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.530318 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.531176 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.531359 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.531507 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.531642 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.633860 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.633926 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.633944 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.633968 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.633987 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.736721 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.736765 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.736775 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.736792 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.736802 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.806313 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:34 crc kubenswrapper[4913]: E1001 12:38:34.806471 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.839447 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.839529 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.839557 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.839589 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.839614 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.941670 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.941730 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.941746 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.941773 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.941791 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.943105 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.943140 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.943152 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.943165 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.943175 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: E1001 12:38:34.957898 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:34Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.961937 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.961978 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
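The patch failure above is the more telling error: the status update is rejected not by the API server's storage but by the node.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, weeks before the node's current clock of 2025-10-01. A quick way to confirm the expiry independently of kubelet is to pull the leaf certificate off that port; a Go sketch follows (the endpoint is taken from the log; InsecureSkipVerify is deliberate because the goal is to read the expired certificate, not to trust it):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the failing webhook POST in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Skip chain verification so the handshake succeeds even with an
		// expired certificate; we only want to inspect the leaf.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject: ", leaf.Subject)
	fmt.Println("notAfter:", leaf.NotAfter.Format(time.RFC3339))
	if time.Now().After(leaf.NotAfter) {
		fmt.Println("certificate is expired, matching the x509 error above")
	}
}

If this is a CRC cluster resumed after a long pause, this state is expected for a short window: the cluster rotates its internal certificates on startup, and these retries should stop once the webhook is re-issued a valid chain.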
event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.961993 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.962013 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.962026 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: E1001 12:38:34.973944 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:34Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.978247 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.978328 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
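Note the spacing of the five identical patch failures in this stretch of the log: 12:38:34.957898, .973944, .996726, then 12:38:35.014515 and .036783, all within roughly 80 ms. That pattern matches the kubelet's node-status sync, which retries the patch a fixed number of times back to back (the nodeStatusUpdateRetry constant, 5 in the kubelet source) before giving up until the next sync tick. A schematic Go sketch of that loop, simplified rather than the real implementation, with tryUpdateNodeStatus standing in for the kubelet method that builds and sends the strategic-merge patch seen above:

package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry mirrors the kubelet constant: each sync pass
// gets five back-to-back attempts before the failure is surfaced.
const nodeStatusUpdateRetry = 5

// tryUpdateNodeStatus is a stand-in; in this incident every attempt
// fails the same way because the admission webhook's certificate is expired.
func tryUpdateNodeStatus(attempt int) error {
	return errors.New("failed calling webhook node.network-node-identity.openshift.io: certificate has expired")
}

func main() {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(i); err != nil {
			fmt.Printf("attempt %d: %v\n", i+1, err)
			continue
		}
		return
	}
	fmt.Println("unable to update node status after", nodeStatusUpdateRetry, "attempts")
}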
event="NodeHasNoDiskPressure" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.978346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.978370 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:34 crc kubenswrapper[4913]: I1001 12:38:34.978387 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:34Z","lastTransitionTime":"2025-10-01T12:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:34 crc kubenswrapper[4913]: E1001 12:38:34.996726 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:35Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.001214 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.001257 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.001292 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.001314 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.001330 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:35 crc kubenswrapper[4913]: E1001 12:38:35.014515 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:35Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.023837 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.023896 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.023909 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.023928 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.023941 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:35 crc kubenswrapper[4913]: E1001 12:38:35.036783 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:35Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:35 crc kubenswrapper[4913]: E1001 12:38:35.036989 4913 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.044361 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.044397 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.044409 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.044428 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.044440 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.146224 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.146320 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.146338 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.146363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.146382 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.249038 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.249083 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.249094 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.249115 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.249127 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.351565 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.351628 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.351648 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.351674 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.351691 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.454588 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.454654 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.454681 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.454710 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.454735 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.557779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.557862 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.557887 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.557917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.557938 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.661211 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.661253 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.661298 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.661319 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.661333 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.763634 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.763666 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.763675 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.763687 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.763697 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.805698 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.805760 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.805723 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:35 crc kubenswrapper[4913]: E1001 12:38:35.805882 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:35 crc kubenswrapper[4913]: E1001 12:38:35.806006 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:35 crc kubenswrapper[4913]: E1001 12:38:35.806133 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.865984 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.866038 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.866055 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.866078 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.866097 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.968890 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.968953 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.968964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.968981 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:35 crc kubenswrapper[4913]: I1001 12:38:35.968994 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:35Z","lastTransitionTime":"2025-10-01T12:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.072006 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.072047 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.072059 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.072078 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.072089 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.174045 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.174086 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.174097 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.174115 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.174129 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.277136 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.277201 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.277225 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.277257 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.277317 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.380343 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.380463 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.380486 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.380509 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.380527 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.483486 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.483542 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.483559 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.483583 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.483599 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.587197 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.587249 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.587295 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.587321 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.587341 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.690811 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.690872 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.690897 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.690929 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.690952 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.714435 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.726095 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.736699 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.756604 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.769863 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.782790 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.793648 4913 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.793713 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.793724 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.793743 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.793758 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.798168 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.806781 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:36 crc kubenswrapper[4913]: E1001 12:38:36.806929 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.816391 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.828606 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.850873 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad95569
7ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.864238 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.873794 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.886472 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.896552 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.896594 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.896605 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.896621 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.896633 4913 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.901625 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.914532 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.929759 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.945665 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.966371 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.999786 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.999829 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.999840 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.999857 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:36 crc kubenswrapper[4913]: I1001 12:38:36.999868 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:36Z","lastTransitionTime":"2025-10-01T12:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.103195 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.103249 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.103287 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.103311 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.103324 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.205756 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.205830 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.205849 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.206251 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.206490 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.310413 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.310773 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.310949 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.311146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.311328 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.415712 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.415767 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.415782 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.415808 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.415824 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.518629 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.518661 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.518672 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.518688 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.518700 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.621913 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.621969 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.621986 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.622008 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.622024 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.725302 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.725357 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.725369 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.725387 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.725399 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:37Z","lastTransitionTime":"2025-10-01T12:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.806454 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp"
Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.806478 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:38:37 crc kubenswrapper[4913]: I1001 12:38:37.806632 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:38:37 crc kubenswrapper[4913]: E1001 12:38:37.806833 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18"
Oct 01 12:38:37 crc kubenswrapper[4913]: E1001 12:38:37.807540 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:38:37 crc kubenswrapper[4913]: E1001 12:38:37.807737 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[identical node-status blocks, logged roughly every 100 ms from 12:38:37.827 through 12:38:38.654, omitted]
Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.757299 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.757595 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.757679 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.757751 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.757808 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:38Z","lastTransitionTime":"2025-10-01T12:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.806086 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:38 crc kubenswrapper[4913]: E1001 12:38:38.806491 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.822369 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason
\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abd
f5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.833620 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.851907 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.859611 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.859822 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.859914 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.859996 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.860053 4913 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:38Z","lastTransitionTime":"2025-10-01T12:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.869056 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.882999 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.899339 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.917998 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.931192 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37
:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.944914 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.958504 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.961964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.962007 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.962019 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.962037 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.962050 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:38Z","lastTransitionTime":"2025-10-01T12:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.971090 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.986103 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:38 crc kubenswrapper[4913]: I1001 12:38:38.997245 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.008921 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.020757 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.035912 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.051009 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.063560 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.063601 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.063613 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.063628 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.063639 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.166077 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.166112 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.166123 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.166141 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.166153 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.268827 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.268880 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.268895 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.268915 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.268930 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.371096 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.371147 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.371164 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.371185 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.371202 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.474165 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.474230 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.474251 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.474312 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.474352 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.577730 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.577805 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.577817 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.577835 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.577849 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.617172 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.617416 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.617504 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:55.617480542 +0000 UTC m=+67.520956150 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.680851 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.680920 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.680936 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.680964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.680981 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.784019 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.784063 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.784080 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.784103 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.784120 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.806656 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.806732 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.806746 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.806899 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.807115 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.807854 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.808303 4913 scope.go:117] "RemoveContainer" containerID="cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.886188 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.886215 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.886224 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.886237 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.886246 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.920021 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.920128 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.920148 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920218 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:11.920183692 +0000 UTC m=+83.823659300 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920230 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920347 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:11.920333606 +0000 UTC m=+83.823809194 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.920591 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920623 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920678 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920692 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920703 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.920682 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:11.920669176 +0000 UTC m=+83.824144774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.921045 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:11.921023726 +0000 UTC m=+83.824499344 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.920630 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.921144 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.921160 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.921173 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:39 crc kubenswrapper[4913]: E1001 12:38:39.921307 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:11.921252812 +0000 UTC m=+83.824728390 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.988419 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.988460 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.988472 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.988490 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:39 crc kubenswrapper[4913]: I1001 12:38:39.988501 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:39Z","lastTransitionTime":"2025-10-01T12:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.091111 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.091154 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.091167 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.091184 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.091196 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.162916 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/1.log" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.164829 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.166245 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.166637 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.180837 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni
-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.193132 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.193166 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.193177 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.193194 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.193206 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.203589 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.215583 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.230849 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.247748 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.264317 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.286692 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.295883 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.295931 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.295945 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.295962 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.295974 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.300820 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.312873 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.326851 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.342327 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.352966 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.364042 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.372762 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.384750 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.393773 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.401158 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.401218 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 
12:38:40.401229 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.401248 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.401261 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.411212 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f
1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.434959 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.448790 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 
12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.464197 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.476293 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.488847 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.499538 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.502981 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.503086 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.503151 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.503221 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.503311 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.511655 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.521175 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.533509 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.543675 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.554997 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.565645 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.574608 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.588420 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.597387 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.606341 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.606378 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 
12:38:40.606388 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.606402 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.606419 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.608917 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f
1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.621568 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.708021 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.708048 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.708058 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.708072 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.708083 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.807449 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:40 crc kubenswrapper[4913]: E1001 12:38:40.807565 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.810186 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.810456 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.810638 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.810842 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.811018 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.913959 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.913997 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.914006 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.914019 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:40 crc kubenswrapper[4913]: I1001 12:38:40.914029 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:40Z","lastTransitionTime":"2025-10-01T12:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.017412 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.017702 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.017839 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.017975 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.018134 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.120813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.120842 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.120850 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.120863 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.120872 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.170183 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/2.log" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.170863 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/1.log" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.173508 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d" exitCode=1 Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.173554 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.173604 4913 scope.go:117] "RemoveContainer" containerID="cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.174057 4913 scope.go:117] "RemoveContainer" containerID="a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d" Oct 01 12:38:41 crc kubenswrapper[4913]: E1001 12:38:41.174231 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.195493 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.217405 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbef574a810c1d91814ac3e5f87990d83ad955697ef10da85b0f25bf85ad27e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ing metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 12:38:19.793164 6348 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 12:38:19.793204 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nl
d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.223299 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.223326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.223335 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.223348 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.223357 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.228388 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.245319 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.258676 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.274518 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.286601 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.301644 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.318520 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.331562 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.331874 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc 
kubenswrapper[4913]: I1001 12:38:41.332006 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.332135 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.332261 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.334936 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 
01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.352291 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.370116 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.387149 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.402237 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.416657 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.430663 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.435592 4913 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.435646 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.435662 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.435686 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.435704 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.447525 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:41Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.538661 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.538708 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.538722 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.538740 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.538752 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.640833 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.640899 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.640916 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.640940 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.640957 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.744097 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.744159 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.744176 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.744203 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.744223 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.806063 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.806121 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:41 crc kubenswrapper[4913]: E1001 12:38:41.806248 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.806373 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:41 crc kubenswrapper[4913]: E1001 12:38:41.806552 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:41 crc kubenswrapper[4913]: E1001 12:38:41.806730 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.847673 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.847749 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.847775 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.847808 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.847831 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.950012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.950049 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.950057 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.950070 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:41 crc kubenswrapper[4913]: I1001 12:38:41.950079 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:41Z","lastTransitionTime":"2025-10-01T12:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.052246 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.052350 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.052373 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.052396 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.052413 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.155369 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.155418 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.155437 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.155460 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.155478 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.177741 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/2.log" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.181000 4913 scope.go:117] "RemoveContainer" containerID="a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d" Oct 01 12:38:42 crc kubenswrapper[4913]: E1001 12:38:42.181147 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.199121 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.226536 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.241042 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.257001 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.258494 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.258557 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.258580 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.258611 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.258634 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.270910 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.279944 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.291809 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.303795 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.313684 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.322394 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.332436 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.342111 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.354766 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.360756 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.360796 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.360807 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.360823 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.360833 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.367359 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.377980 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.392225 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.410802 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.464905 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.464960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.464979 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.465002 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.465019 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.567376 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.567478 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.567498 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.567522 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.567540 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.671482 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.671541 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.671563 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.671593 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.671615 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.774120 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.774228 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.774246 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.774303 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.774321 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.805906 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:42 crc kubenswrapper[4913]: E1001 12:38:42.806119 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.877411 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.877475 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.877490 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.877511 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.877527 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.980907 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.980973 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.980998 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.981027 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:42 crc kubenswrapper[4913]: I1001 12:38:42.981048 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:42Z","lastTransitionTime":"2025-10-01T12:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.084234 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.084322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.084341 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.084362 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.084380 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.186691 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.186719 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.186727 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.186740 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.186748 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.290229 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.290326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.290350 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.290377 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.290395 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.392872 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.392919 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.392933 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.392954 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.392967 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.496357 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.496423 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.496441 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.496464 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.496485 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.599238 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.599318 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.599332 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.599351 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.599367 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
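Note: from 12:38:42 onward the same five-record cycle repeats roughly every 100 ms: four "Recording event message" lines (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) followed by one "Node became not ready" setters.go record, always carrying the identical no-CNI-configuration message. Nothing new is failing here; the kubelet is re-announcing a single condition while it waits for a CNI config to appear in /etc/kubernetes/cni/net.d/. A small sketch (hypothetical helper, assuming Python 3 and a journal dump on stdin) that tallies kubelet messages by source location, which makes a repeating cycle like this stand out:

    # tally_kubelet.py -- count kubelet klog messages in a journal dump,
    # keyed by source location plus the first quoted message string.
    # Hypothetical helper; reads e.g. `journalctl -u kubelet` from stdin.
    import re
    import sys
    from collections import Counter

    # klog header: I1001 12:38:42.465019 4913 setters.go:603] "Node became..."
    KLOG = re.compile(r'[IWEF]\d{4} [\d:.]+\s+\d+\s+(\S+)\]\s+"([^"]*)"')

    counts = Counter()
    for line in sys.stdin:
        m = KLOG.search(line)
        if m:
            counts[f"{m.group(1)} {m.group(2)}"] += 1

    for msg, n in counts.most_common(15):
        print(f"{n:6d}  {msg}")

On this excerpt the top entries would be kubelet_node_status.go:724 "Recording event message for node" and setters.go:603 "Node became not ready", confirming the flap is one condition repeating rather than an accumulation of distinct failures.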
Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.721966 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.722003 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.722012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.722027 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.722036 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.806740 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.806750 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:43 crc kubenswrapper[4913]: E1001 12:38:43.806954 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.806783 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:43 crc kubenswrapper[4913]: E1001 12:38:43.807142 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:43 crc kubenswrapper[4913]: E1001 12:38:43.807315 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.824850 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.824920 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.824939 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.824965 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.824983 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.927082 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.927134 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.927145 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.927163 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:43 crc kubenswrapper[4913]: I1001 12:38:43.927176 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:43Z","lastTransitionTime":"2025-10-01T12:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.030091 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.030147 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.030166 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.030191 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.030210 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.134334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.134401 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.134420 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.134445 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.134463 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.237436 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.237496 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.237513 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.237537 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.237556 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.341032 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.341092 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.341110 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.341140 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.341164 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.443673 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.443713 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.443724 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.443738 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.443748 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.546824 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.546892 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.546917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.546947 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.546970 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.650543 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.650620 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.650638 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.650663 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.650681 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.754122 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.754570 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.754779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.754948 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.755099 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.805953 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:44 crc kubenswrapper[4913]: E1001 12:38:44.806606 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
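Note: the util.go:30 / pod_workers.go:1301 pairs show why these workloads are stalled: each pod needs a fresh sandbox, sandbox creation requires a CNI network, and none is configured, so the sync is skipped and the pods sit in ContainerCreating. A sketch that lists pods waiting in that state, assuming the official kubernetes Python client and a kubeconfig with read access (the script name is hypothetical):

    # stuck_pods.py -- list pods whose containers are waiting in
    # ContainerCreating. Assumes the "kubernetes" PyPI client and a
    # usable kubeconfig; hypothetical helper, not cluster tooling.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    for pod in v1.list_pod_for_all_namespaces().items:
        for cs in pod.status.container_statuses or []:
            waiting = cs.state.waiting
            if waiting and waiting.reason == "ContainerCreating":
                print(f"{pod.metadata.namespace}/{pod.metadata.name} "
                      f"container={cs.name}")

Against this node it would be expected to include at least network-check-target-xd92c, network-check-source-55646444c4-trplf, networking-console-plugin-85b44fc459-gdk6g, and network-metrics-daemon-8c8wp, assuming their last successfully reported state matches the patches above.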
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.857677 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.858351 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.858376 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.858394 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.858407 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.961408 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.961474 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.961497 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.961525 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:44 crc kubenswrapper[4913]: I1001 12:38:44.961546 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:44Z","lastTransitionTime":"2025-10-01T12:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.064366 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.064416 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.064434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.064459 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.064476 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.167216 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.167306 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.167323 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.167346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.167363 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.270350 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.270426 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.270446 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.270469 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.270488 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
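Note: the records that follow are the same failure one scope up. The kubelet's own node-status patch (the MemoryPressure/DiskPressure/PIDPressure/Ready conditions plus allocatable, capacity, and the node's cached image list) is rejected because node.network-node-identity.openshift.io sits behind the same expired certificate on 127.0.0.1:9743, and the kubelet retries, which is why the identical multi-kilobyte payload appears back to back at 12:38:45.326581 and 12:38:45.350597. A sketch that pulls each distinct webhook/expiry pair out of such a dump, with regexes written against the x509 error text visible in these records (hypothetical helper):

    # cert_errors.py -- extract distinct certificate-expiry failures from
    # a journal dump on stdin; regexes follow the error text format seen
    # in these logs. Hypothetical helper, not cluster tooling.
    import re
    import sys

    WEBHOOK = re.compile(r'failed calling webhook \W{0,2}([\w.-]+)')
    EXPIRY = re.compile(r'current time (\S+) is after ([0-9TZ:-]+)')

    seen = set()
    for line in sys.stdin:
        hooks = WEBHOOK.findall(line)
        expiries = EXPIRY.findall(line)
        # in these logs each failure record carries exactly one webhook
        # name and one expiry pair, in matching order
        for hook, (now, exp) in zip(hooks, expiries):
            if (hook, exp) not in seen:
                seen.add((hook, exp))
                print(f"{hook}: certificate expired {exp} (clock was {now})")

On this excerpt it would report pod.network-node-identity.openshift.io and node.network-node-identity.openshift.io, both expired 2025-08-24T17:21:41Z.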
Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.304856 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.304920 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.304945 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.304974 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.304997 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.326581 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.333187 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.333258 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.333324 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.333351 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.333369 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.350597 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.360529 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.360605 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.360623 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.360695 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.360714 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.379383 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.384738 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.384779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
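Every retry in this burst fails the same way: the kubelet cannot verify the serving certificate of the node.network-node-identity webhook at https://127.0.0.1:9743, which expired on 2025-08-24T17:21:41Z while the host clock reads 2025-10-01. A minimal diagnostic sketch (not part of the log) that reproduces the kubelet's finding from the node itself, assuming Python 3 with the third-party cryptography package; the host and port are taken from the Post URL in the error above:

    # Read the notAfter of the webhook serving certificate the kubelet is
    # failing to verify. Fetch it WITHOUT verification (verification is
    # exactly what fails above), then parse the validity window.
    import datetime
    import ssl

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # from the Post URL in the log entry

    pem = ssl.get_server_certificate((HOST, PORT))  # no ca_certs: no verify
    cert = x509.load_pem_x509_certificate(pem.encode())

    not_after = cert.not_valid_after.replace(tzinfo=datetime.timezone.utc)
    now = datetime.datetime.now(datetime.timezone.utc)
    print("notAfter:", not_after.isoformat())
    if now > not_after:
        # For this log we would expect the 2025-08-24T17:21:41Z expiry here.
        print("certificate has expired")

If the sketch prints the 2025-08-24 expiry, the patch payload itself is well-formed and the only fix needed is rotating the webhook's certificate.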
event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.384789 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.384803 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.384850 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.407021 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.411997 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.412043 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.412052 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.412068 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.412079 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.431021 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:45Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.431687 4913 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.433819 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
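The 12:38:45.431687 entry marks an exhausted retry budget rather than a new failure: upstream kubelet attempts the status patch a fixed number of times per sync (the nodeStatusUpdateRetry constant, 5 in the kubelet source) before logging "Unable to update node status", which matches the five identical webhook errors above. A sketch of that control flow, with illustrative names, not kubelet source:

    # Illustrative shape of the kubelet's node-status update loop. Each
    # attempt stands in for the PATCH of /api/v1/nodes/<node>/status; in
    # this log every attempt dies at the webhook's TLS handshake.
    NODE_STATUS_UPDATE_RETRY = 5  # kubelet's nodeStatusUpdateRetry constant

    def try_patch_node_status() -> bool:
        return False  # every attempt above failed the same way

    def update_node_status() -> None:
        for _ in range(NODE_STATUS_UPDATE_RETRY):
            if try_patch_node_status():
                return  # success: status accepted by the API server
            # on each failure: "Error updating node status, will retry"
        # after the fifth failure, the log's final error:
        raise RuntimeError("update node status exceeds retry count")

The kubelet then tries again on the next --node-status-update-frequency tick, which is why the same burst keeps repeating below until the certificate is fixed.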
event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.433881 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.433904 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.433932 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.433954 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.537134 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.537171 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.537182 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.537196 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.537223 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.639904 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.639941 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.639949 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.639964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.639975 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.742389 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.742456 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.742473 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.742497 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.742514 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.806906 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.806970 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.806999 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.807085 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.807310 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:45 crc kubenswrapper[4913]: E1001 12:38:45.807466 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.844926 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.844977 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.844988 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.845008 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.845021 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.948917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.948986 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.949001 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.949018 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:45 crc kubenswrapper[4913]: I1001 12:38:45.949032 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:45Z","lastTransitionTime":"2025-10-01T12:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.052066 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.052117 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.052129 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.052144 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.052155 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.154498 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.154565 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.154580 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.154602 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.154620 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.257766 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.257802 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.257813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.257828 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.257839 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.360548 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.360605 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.360626 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.360652 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.360674 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.462813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.462849 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.462858 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.462890 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.462901 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.565777 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.565831 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.565847 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.565871 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.565887 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.669581 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.669617 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.669628 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.669644 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.669656 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.772805 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.772860 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.772876 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.772899 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.772916 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.806666 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:46 crc kubenswrapper[4913]: E1001 12:38:46.806848 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.875520 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.875566 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.875582 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.875606 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.875630 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.979120 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.979174 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.979193 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.979217 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:46 crc kubenswrapper[4913]: I1001 12:38:46.979234 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:46Z","lastTransitionTime":"2025-10-01T12:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.082613 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.082683 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.082723 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.082754 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.082774 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.185462 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.185496 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.185505 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.185518 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.185527 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.287764 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.287823 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.287841 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.287866 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.287889 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.390156 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.390213 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.390238 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.390306 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.390333 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.493169 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.493233 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.493249 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.493300 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.493317 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.596246 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.596340 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.596358 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.596385 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.596403 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.699137 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.699213 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.699235 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.699258 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.699311 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.802406 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.802452 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.802468 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.802488 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.802504 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.806112 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.806144 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.806212 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:47 crc kubenswrapper[4913]: E1001 12:38:47.806350 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:47 crc kubenswrapper[4913]: E1001 12:38:47.806424 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:47 crc kubenswrapper[4913]: E1001 12:38:47.806864 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.904960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.905019 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.905038 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.905064 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:47 crc kubenswrapper[4913]: I1001 12:38:47.905082 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:47Z","lastTransitionTime":"2025-10-01T12:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.007012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.007078 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.007096 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.007122 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.007138 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.109911 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.109958 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.109969 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.109997 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.110010 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.212762 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.212803 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.212811 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.212826 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.212835 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.316183 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.316244 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.316260 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.316317 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.316336 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.419132 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.419202 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.419219 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.419246 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.419302 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.521402 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.521483 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.521502 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.521532 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.521554 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.623856 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.623891 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.623900 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.623916 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.623925 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.726306 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.726367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.726388 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.726412 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.726430 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.805626 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:48 crc kubenswrapper[4913]: E1001 12:38:48.805721 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.818364 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.830367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.830459 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.830477 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.830506 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.830526 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.836149 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.864591 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.878821 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.895754 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.908230 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.923259 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.933384 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.933446 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.933470 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.933502 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.933525 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:48Z","lastTransitionTime":"2025-10-01T12:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.937602 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.948778 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.963176 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.977347 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:48 crc kubenswrapper[4913]: I1001 12:38:48.989602 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.002345 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.013257 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.026225 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:49Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.036377 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.036418 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.036426 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.036456 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.036466 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.041915 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:49Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.058445 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:49Z is after 2025-08-24T17:21:41Z"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.138561 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.138602 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.138615 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.138633 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.138645 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.241042 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.241097 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.241114 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.241139 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.241156 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.344357 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.344620 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.344631 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.344648 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.344676 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.447139 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.447179 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.447191 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.447206 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.447217 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.550023 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.550108 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.550131 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.550162 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.550185 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.653212 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.653334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.653368 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.653393 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.653411 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.756304 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.756362 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.756379 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.756404 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.756422 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.806500 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.806559 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.806578 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:38:49 crc kubenswrapper[4913]: E1001 12:38:49.806731 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18"
Oct 01 12:38:49 crc kubenswrapper[4913]: E1001 12:38:49.806866 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:38:49 crc kubenswrapper[4913]: E1001 12:38:49.806974 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.859623 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.859685 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.859698 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.859718 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.859731 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.962504 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.962568 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.962586 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.962611 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:49 crc kubenswrapper[4913]: I1001 12:38:49.962629 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:49Z","lastTransitionTime":"2025-10-01T12:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
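Every status patch above fails the same way: the kubelet cannot reach the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because the certificate it serves expired on 2025-08-24T17:21:41Z. A minimal check from the node, assuming shell access and that openssl is installed, is to read the validity window straight off the listener:

  # Print notBefore/notAfter for the certificate served on 127.0.0.1:9743;
  # if notAfter is in the past, the webhook cert must be rotated before
  # these status patches can succeed.
  echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -dates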
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.064885 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.064927 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.064937 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.064960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.064972 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.168146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.168196 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.168210 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.168227 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.168238 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.271759 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.271791 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.272043 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.272066 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.272076 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.375720 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.375790 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.375815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.375846 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.375869 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.478494 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.478557 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.478583 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.478614 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.478640 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.581880 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.581921 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.581935 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.581954 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.581967 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.684889 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.684936 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.684952 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.684979 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.684996 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.787368 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.787414 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.787422 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.787434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.787446 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.806765 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:38:50 crc kubenswrapper[4913]: E1001 12:38:50.806918 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.889867 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.889905 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.889917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.889932 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.889942 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.992509 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.992533 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.992541 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.992555 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:50 crc kubenswrapper[4913]: I1001 12:38:50.992563 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:50Z","lastTransitionTime":"2025-10-01T12:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
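The repeating KubeletNotReady condition has a single cause recorded in every entry: the runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration yet. Two quick checks from the node, assuming shell access (crictl is the standard CRI client, though it may need its runtime endpoint configured):

  # The directory the kubelet is polling; it stays empty until the network
  # plugin writes its config.
  ls -l /etc/kubernetes/cni/net.d/
  # The runtime's own view; look for the NetworkReady condition in the output.
  crictl info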
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.094589 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.094656 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.094667 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.094686 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.094699 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.197228 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.197295 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.197309 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.197328 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.197338 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.299964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.300014 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.300026 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.300044 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.300059 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.402177 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.402204 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.402214 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.402228 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.402239 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.505111 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.505174 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.505186 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.505205 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.505215 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.607669 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.607708 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.607717 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.607734 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.607746 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.710850 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.710884 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.710892 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.710905 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.710914 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.806039 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.806063 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 12:38:51 crc kubenswrapper[4913]: E1001 12:38:51.806158 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.806257 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 12:38:51 crc kubenswrapper[4913]: E1001 12:38:51.806426 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 12:38:51 crc kubenswrapper[4913]: E1001 12:38:51.806620 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.817363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.817410 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.817421 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.817437 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.817449 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.920128 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.920164 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.920181 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.920196 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:51 crc kubenswrapper[4913]: I1001 12:38:51.920207 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:51Z","lastTransitionTime":"2025-10-01T12:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
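The "No sandbox for pod can be found" / "Error syncing pod" pairs keep naming the same pods, which cannot get sandboxes until a CNI configuration exists. From a workstation with cluster access, assuming a working kubeconfig, the affected pods can be inspected directly:

  # The pods named in the entries above; expect them to sit Pending or not
  # ready until the network plugin finishes starting.
  oc get pod -n openshift-multus network-metrics-daemon-8c8wp -o wide
  oc get pod -n openshift-network-diagnostics network-check-source-55646444c4-trplf network-check-target-xd92c -o wide
  oc get pod -n openshift-network-console networking-console-plugin-85b44fc459-gdk6g -o wide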
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.022942 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.023006 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.023015 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.023028 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.023039 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.125576 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.125639 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.125658 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.125682 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.125703 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.227551 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.227600 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.227612 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.227633 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.227645 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.329957 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.330027 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.330049 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.330078 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.330098 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.433008 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.433049 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.433061 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.433077 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.433087 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.535539 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.535587 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.535599 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.535615 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.535626 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.638146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.638236 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.638247 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.638288 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.638298 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.740189 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.740225 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.740232 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.740245 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.740254 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.810773 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:38:52 crc kubenswrapper[4913]: E1001 12:38:52.818971 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.842380 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.842425 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.842439 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.842463 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.842478 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.945314 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.945343 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.945354 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.945370 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:38:52 crc kubenswrapper[4913]: I1001 12:38:52.945382 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:52Z","lastTransitionTime":"2025-10-01T12:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.048448 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.048492 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.048510 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.048533 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.048549 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.151593 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.151638 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.151647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.151661 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.151670 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.253810 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.253836 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.253844 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.253857 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.253866 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.355875 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.355907 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.355918 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.355932 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.355942 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.458239 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.458297 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.458307 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.458322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.458332 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.560897 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.560965 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.560981 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.561006 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.561024 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.663982 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.664021 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.664030 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.664045 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.664055 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.765758 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.765799 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.765811 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.765825 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.765838 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.806633 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.806646 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.806754 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:53 crc kubenswrapper[4913]: E1001 12:38:53.806942 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:53 crc kubenswrapper[4913]: E1001 12:38:53.807081 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:53 crc kubenswrapper[4913]: E1001 12:38:53.807196 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.868453 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.868576 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.868602 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.868633 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.868690 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.970633 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.970671 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.970682 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.970697 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:53 crc kubenswrapper[4913]: I1001 12:38:53.970707 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:53Z","lastTransitionTime":"2025-10-01T12:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.072740 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.072787 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.072799 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.072815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.072827 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.174905 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.174943 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.174954 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.174971 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.174982 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.277475 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.277519 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.277588 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.277607 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.277619 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.379891 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.379939 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.379949 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.379966 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.379977 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.482045 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.482073 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.482083 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.482095 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.482105 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.584381 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.584434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.584454 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.584484 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.584507 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.686606 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.686670 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.686682 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.686698 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.686707 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.788460 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.788495 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.788505 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.788519 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.788529 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.805998 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:54 crc kubenswrapper[4913]: E1001 12:38:54.806137 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.890526 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.890565 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.890574 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.890589 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.890597 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.992801 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.993081 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.993152 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.993382 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:54 crc kubenswrapper[4913]: I1001 12:38:54.993455 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:54Z","lastTransitionTime":"2025-10-01T12:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.095691 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.095726 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.095737 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.095751 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.095762 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.197784 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.198326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.198435 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.198525 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.198618 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.301011 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.301062 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.301072 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.301085 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.301094 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.403655 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.403694 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.403704 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.403719 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.403731 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.470979 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.471017 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.471026 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.471041 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.471051 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.481506 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:55Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.484570 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.484615 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.484626 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.484638 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.484647 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.495858 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:55Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.501470 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.501796 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
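The status patch above is not rejected by the API server's own validation: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, more than a month before the clock time in these entries. One quick way to confirm from the node is to read the certificate off the socket; a minimal sketch, with InsecureSkipVerify set deliberately so the expired chain can still be inspected rather than trusted:

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the failed webhook call in the log. Verification is
	// skipped on purpose: the point is to read the certificate, not trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject=%s notBefore=%s notAfter=%s\n", cert.Subject, cert.NotBefore, cert.NotAfter)
	if time.Now().After(cert.NotAfter) {
		fmt.Println("serving certificate is expired; this matches the x509 error in the log")
	}
}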
event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.501827 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.501846 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.501858 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.512520 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:55Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.516177 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.516211 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.516225 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.516243 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.516254 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.526429 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:55Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.529531 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.529566 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.529575 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.529605 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.529617 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.540221 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:55Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.540358 4913 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.541500 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.541530 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.541541 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.541555 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.541565 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.643909 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.643966 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.643978 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.643993 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.644004 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.686842 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.686969 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.687021 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.687007583 +0000 UTC m=+99.590483161 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.746219 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.746249 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.746259 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.746303 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.746312 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.807019 4913 scope.go:117] "RemoveContainer" containerID="a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d" Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.807221 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.807387 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.807438 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.807535 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.807581 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.807675 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:55 crc kubenswrapper[4913]: E1001 12:38:55.807724 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.848302 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.848329 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.848337 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.848350 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.848359 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.950822 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.950859 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.950867 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.950879 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:55 crc kubenswrapper[4913]: I1001 12:38:55.950888 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:55Z","lastTransitionTime":"2025-10-01T12:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.053284 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.053322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.053331 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.053346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.053356 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.155869 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.155911 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.155924 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.155941 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.155955 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.259108 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.259159 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.259171 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.259190 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.259203 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.361481 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.361521 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.361532 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.361547 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.361557 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.463798 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.463849 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.463865 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.463885 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.463894 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.565961 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.565998 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.566007 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.566022 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.566034 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.668288 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.668324 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.668334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.668347 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.668357 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.770917 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.770946 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.770956 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.770973 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.770982 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.805683 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:56 crc kubenswrapper[4913]: E1001 12:38:56.805833 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.872809 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.872854 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.872865 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.872881 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.872891 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.974752 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.974781 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.974790 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.974805 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:56 crc kubenswrapper[4913]: I1001 12:38:56.974815 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:56Z","lastTransitionTime":"2025-10-01T12:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.076912 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.076947 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.076956 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.076970 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.076979 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.180216 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.180243 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.180252 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.180278 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.180288 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.226712 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/0.log" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.226782 4913 generic.go:334] "Generic (PLEG): container finished" podID="b2420adf-64bd-4d67-ac95-9337ed10149a" containerID="5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0" exitCode=1 Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.226820 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerDied","Data":"5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.227333 4913 scope.go:117] "RemoveContainer" containerID="5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.238482 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file 
check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.250644 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.268941 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.278840 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.282234 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.282310 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.282334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.282365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.282387 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.290414 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.304401 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.313343 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.325801 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.337360 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.350246 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.360406 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.369355 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.379119 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.384961 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.385007 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.385020 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.385038 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.385050 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.389774 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.402600 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.412322 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.421771 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:57Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.487672 4913 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.487715 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.487726 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.487743 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.487754 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.589573 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.589602 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.589611 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.589624 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.589633 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.691524 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.691559 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.691569 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.691590 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.691603 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.793481 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.793507 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.793516 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.793529 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.793538 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.805836 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.805836 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:57 crc kubenswrapper[4913]: E1001 12:38:57.805949 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:57 crc kubenswrapper[4913]: E1001 12:38:57.806014 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.806058 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:57 crc kubenswrapper[4913]: E1001 12:38:57.806377 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.895069 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.895108 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.895122 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.895135 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.895146 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.997952 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.997990 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.998002 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.998017 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:57 crc kubenswrapper[4913]: I1001 12:38:57.998027 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:57Z","lastTransitionTime":"2025-10-01T12:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.100446 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.100482 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.100490 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.100503 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.100515 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.203560 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.203619 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.203636 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.203659 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.203675 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.230701 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/0.log" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.230750 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerStarted","Data":"4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.249495 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.263048 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.274342 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.287875 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.305940 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.305967 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.305976 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.305989 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.305999 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.309883 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.320035 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.338031 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.350399 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.358670 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.371758 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.381802 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.392153 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.404385 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.407392 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.407419 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.407426 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.407440 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.407449 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.416851 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.432438 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 
12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.444406 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.466208 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.509919 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.509952 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.509962 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.510001 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.510016 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.612432 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.612455 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.612463 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.612476 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.612483 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.714676 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.714722 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.714735 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.714752 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.714763 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.805987 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:58 crc kubenswrapper[4913]: E1001 12:38:58.806107 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.816700 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.817296 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.817334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.817346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.817360 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.817370 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.826380 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.837587 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.849235 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.860398 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.871624 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.883715 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.895383 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.913653 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.919184 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.919212 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.919220 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.919454 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.919465 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:58Z","lastTransitionTime":"2025-10-01T12:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.924302 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.936294 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.946813 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.959551 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.968426 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.979889 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:58 crc kubenswrapper[4913]: I1001 12:38:58.989229 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.004417 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.021793 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.021820 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.021852 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.021868 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.021880 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.124934 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.125000 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.125009 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.125021 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.125031 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.227920 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.227947 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.227957 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.227971 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.227981 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.330325 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.330387 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.330399 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.330414 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.330429 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.432713 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.432749 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.432757 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.432771 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.432781 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.535038 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.535095 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.535109 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.535126 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.535139 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.637179 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.637233 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.637251 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.637311 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.637333 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.739752 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.739788 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.739799 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.739813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.739825 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.805939 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.805980 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.806043 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:59 crc kubenswrapper[4913]: E1001 12:38:59.806224 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:59 crc kubenswrapper[4913]: E1001 12:38:59.806493 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:38:59 crc kubenswrapper[4913]: E1001 12:38:59.806637 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.841637 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.841668 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.841681 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.841694 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.841705 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.943611 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.943666 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.943689 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.943718 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:59 crc kubenswrapper[4913]: I1001 12:38:59.943742 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:59Z","lastTransitionTime":"2025-10-01T12:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.045753 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.045794 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.045807 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.045824 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.045835 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.148308 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.148337 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.148346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.148381 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.148391 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.251483 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.251515 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.251524 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.251550 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.251559 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.353732 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.353754 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.353761 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.353774 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.353782 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.455855 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.455891 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.455899 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.455913 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.455923 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.557783 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.557855 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.557877 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.557901 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.557918 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.661241 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.661300 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.661308 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.661322 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.661331 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.763897 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.763940 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.763949 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.763962 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.763973 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.806776 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:00 crc kubenswrapper[4913]: E1001 12:39:00.806925 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.866392 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.866426 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.866435 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.866448 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.866459 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.969019 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.969078 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.969092 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.969113 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:00 crc kubenswrapper[4913]: I1001 12:39:00.969127 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:00Z","lastTransitionTime":"2025-10-01T12:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
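The "Error syncing pod" entry above shows the kubelet skipping the network-check-target pod because no CNI configuration exists yet in /etc/kubernetes/cni/net.d/. As a rough illustration, a minimal sketch (not the kubelet's or CRI-O's actual implementation) of what that readiness gate amounts to: scan the directory named in the log for a loadable network config. The extensions checked below match what the libcni library accepts; everything else is assumed for the example:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the log lines above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	found := false
	for _, e := range entries {
		// libcni loads .conf, .conflist, and .json network configuration files.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		// The state this node is in: no config yet, so NetworkReady stays
		// false and pods without a sandbox keep being requeued.
		fmt.Println("no CNI configuration file found; NetworkReady remains false")
	}
}

Once the cluster's network provider (on a typical CRC install, OVN-Kubernetes) writes its configuration into that directory, the runtime reports NetworkReady=true and these sync errors stop.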
Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.071539 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.071605 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.071622 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.071645 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.071662 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.174327 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.174385 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.174399 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.174417 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.174431 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.276975 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.277013 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.277020 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.277037 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.277052 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.379173 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.379215 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.379223 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.379237 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.379247 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.481255 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.481353 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.481372 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.481397 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.481417 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.584011 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.584082 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.584102 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.584126 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.584145 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.686802 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.686860 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.686872 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.686895 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.686906 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.789439 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.789489 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.789506 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.789531 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.789548 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.805623 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.805657 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.805657 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:01 crc kubenswrapper[4913]: E1001 12:39:01.805762 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:01 crc kubenswrapper[4913]: E1001 12:39:01.805951 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:01 crc kubenswrapper[4913]: E1001 12:39:01.806125 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.892337 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.892401 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.892416 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.892435 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.892450 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.994767 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.994807 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.994818 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.994836 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:01 crc kubenswrapper[4913]: I1001 12:39:01.994847 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:01Z","lastTransitionTime":"2025-10-01T12:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.097134 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.097178 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.097221 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.097237 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.097247 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.200547 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.200609 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.200625 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.200648 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.200666 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.303610 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.303655 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.303665 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.303682 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.303694 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.406380 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.406429 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.406446 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.406466 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.406482 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.508538 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.508575 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.508585 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.508600 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.508612 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.611009 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.611207 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.611365 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.611467 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.611549 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.713809 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.713855 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.713863 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.713900 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.713909 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.806298 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:02 crc kubenswrapper[4913]: E1001 12:39:02.806501 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.816575 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.816818 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.816931 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.817123 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.817382 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.920364 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.920647 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.920731 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.920820 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:02 crc kubenswrapper[4913]: I1001 12:39:02.920897 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:02Z","lastTransitionTime":"2025-10-01T12:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.022721 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.023026 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.023141 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.023263 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.023422 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.126709 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.126749 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.126759 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.126776 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.126789 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.229693 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.229754 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.229771 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.229835 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.229859 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.332357 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.332417 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.332434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.332458 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.332476 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.435477 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.435526 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.435538 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.435561 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.435576 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.538760 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.538807 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.538818 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.538832 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.538842 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.641497 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.641560 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.641578 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.641603 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.641620 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.744201 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.744259 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.744333 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.744364 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.744387 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.806136 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.806172 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.806234 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:03 crc kubenswrapper[4913]: E1001 12:39:03.806346 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:03 crc kubenswrapper[4913]: E1001 12:39:03.806660 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:03 crc kubenswrapper[4913]: E1001 12:39:03.806800 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.847480 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.847517 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.847525 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.847540 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.847551 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
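Each setters.go:603 entry carries the full Ready condition the kubelet publishes; the JSON after condition= is that object's wire form. A stdlib-only sketch of the same shape (field names are taken from the JSON in the log, not from the k8s.io/api types, which this example deliberately avoids importing):

package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// nodeCondition mirrors the condition={...} JSON logged by setters.go above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	now := time.Now().UTC().Format(time.RFC3339)
	c := nodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
	b, _ := json.Marshal(c)
	fmt.Println(string(b)) // prints the same shape as the logged condition
}

Note the cadence: the status groups repeat roughly every 100ms (.847, .949, then .052, .155 at 12:39:04), which is consistent with the kubelet's fast node-status update path during startup rather than the normal 10-second heartbeat; it keeps polling at that rate until the node first reports Ready.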
Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.949877 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.949931 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.949947 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.949973 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:03 crc kubenswrapper[4913]: I1001 12:39:03.949990 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:03Z","lastTransitionTime":"2025-10-01T12:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.052825 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.052896 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.052919 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.052957 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.052992 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:04Z","lastTransitionTime":"2025-10-01T12:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.155786 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.155865 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.155889 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.155920 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.155944 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:04Z","lastTransitionTime":"2025-10-01T12:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.258609 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.258688 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.258712 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.258742 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.258765 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:04Z","lastTransitionTime":"2025-10-01T12:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.361171 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.361229 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.361246 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.361312 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.361341 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:04Z","lastTransitionTime":"2025-10-01T12:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.463909 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.463974 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.463993 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.464018 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.464034 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:04Z","lastTransitionTime":"2025-10-01T12:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.566961 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.567016 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.567028 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.567046 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.567055 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:04Z","lastTransitionTime":"2025-10-01T12:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 12:39:04 crc kubenswrapper[4913]: I1001 12:39:04.806316 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 12:39:04 crc kubenswrapper[4913]: E1001 12:39:04.806498 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.543420 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.543657 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.543775 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.543870 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.543965 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:05Z","lastTransitionTime":"2025-10-01T12:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
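
The cycle above repeats roughly every 100 ms for as long as no CNI config exists: the kubelet re-records the same four node events and re-sets the Ready condition to False, always naming an empty /etc/kubernetes/cni/net.d/ as the cause. A minimal Python sketch of the same readiness check follows; the directory comes from the log, while the extension list (.conf, .conflist, .json) mirrors what libcni typically loads and is an assumption here, not something this log states.

#!/usr/bin/env python3
# Minimal sketch: report whether the CNI conf dir the kubelet is
# complaining about actually contains a network config yet.
# The directory comes from the log; the extension set is an
# assumption based on what libcni usually accepts.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")
CNI_EXTENSIONS = {".conf", ".conflist", ".json"}

def cni_configs(conf_dir: Path) -> list[Path]:
    if not conf_dir.is_dir():
        return []
    return sorted(p for p in conf_dir.iterdir()
                  if p.is_file() and p.suffix in CNI_EXTENSIONS)

if __name__ == "__main__":
    found = cni_configs(CNI_CONF_DIR)
    if found:
        for p in found:
            print(f"CNI config present: {p}")
    else:
        # Mirrors the kubelet message: NetworkReady stays False until
        # the network operator drops a config file in this directory.
        print(f"no CNI configuration file in {CNI_CONF_DIR}/")
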
Oct 01 12:39:05 crc kubenswrapper[4913]: E1001 12:39:05.558825 4913 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:39:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:05Z is after 2025-08-24T17:21:41Z"
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9b5c0a1c-ebf9-497b-a54a-617d247fddf8\\\",\\\"systemUUID\\\":\\\"35eb2588-e911-475f-90ba-39d796ce691f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:05Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:05 crc kubenswrapper[4913]: E1001 12:39:05.637484 4913 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.639680 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.639723 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.639735 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.639755 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.639766 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:05Z","lastTransitionTime":"2025-10-01T12:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.742613 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.742662 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.742673 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.742694 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.742706 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:05Z","lastTransitionTime":"2025-10-01T12:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.805979 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:05 crc kubenswrapper[4913]: E1001 12:39:05.806673 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.806106 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:05 crc kubenswrapper[4913]: E1001 12:39:05.806774 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.806084 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:05 crc kubenswrapper[4913]: E1001 12:39:05.806874 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.844882 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.844919 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.844929 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.844944 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.844955 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:05Z","lastTransitionTime":"2025-10-01T12:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.947636 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.947684 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.947700 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.947718 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:05 crc kubenswrapper[4913]: I1001 12:39:05.947732 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:05Z","lastTransitionTime":"2025-10-01T12:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.049704 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.049735 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.049745 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.049759 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.049777 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.152356 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.152424 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.152436 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.152452 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.152463 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.254676 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.254719 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.254729 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.254745 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.254756 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.357531 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.357585 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.357601 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.357621 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.357635 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.459951 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.459989 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.460001 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.460017 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.460027 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.562619 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.562659 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.562671 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.562685 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.562694 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.665050 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.665086 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.665097 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.665112 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.665124 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.766684 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.766717 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.766727 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.766740 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.766749 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.805841 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:06 crc kubenswrapper[4913]: E1001 12:39:06.805995 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.868635 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.868680 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.868690 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.868706 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.868717 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.971057 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.971092 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.971100 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.971112 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:06 crc kubenswrapper[4913]: I1001 12:39:06.971123 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:06Z","lastTransitionTime":"2025-10-01T12:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.073662 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.073740 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.073761 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.073787 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.073805 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.176031 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.176091 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.176108 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.176132 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.176154 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.279026 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.279095 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.279104 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.279121 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.279133 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.381583 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.381645 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.381668 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.381696 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.381717 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.484444 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.484506 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.484526 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.484552 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.484576 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.587581 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.587655 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.587678 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.587707 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.587728 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.690813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.690886 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.690904 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.690930 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.690949 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.793659 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.793718 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.793737 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.793760 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.793775 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.806211 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.806227 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:07 crc kubenswrapper[4913]: E1001 12:39:07.806360 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.806447 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:07 crc kubenswrapper[4913]: E1001 12:39:07.806492 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:07 crc kubenswrapper[4913]: E1001 12:39:07.806644 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.897630 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.897703 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.897721 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.898127 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:07 crc kubenswrapper[4913]: I1001 12:39:07.898184 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:07Z","lastTransitionTime":"2025-10-01T12:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.001252 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.001456 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.001534 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.001570 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.001645 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.104916 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.104995 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.105007 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.105023 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.105035 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.207539 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.207591 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.207609 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.207632 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.207648 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.310407 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.310450 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.310460 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.310477 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.310489 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.412627 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.412686 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.412704 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.412728 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.412746 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.514857 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.514907 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.514919 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.514960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.514975 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.617457 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.617514 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.617527 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.617545 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.617555 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.719296 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.719335 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.719346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.719363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.719374 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.806425 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:08 crc kubenswrapper[4913]: E1001 12:39:08.806547 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.819130 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.821726 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.821758 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.821766 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.821779 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.821788 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.829151 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.839623 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.854135 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.866097 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.879016 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.890403 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.901946 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.917652 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.923749 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.923784 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.923798 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.923813 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.923833 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:08Z","lastTransitionTime":"2025-10-01T12:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.928602 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.938984 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.952343 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.964200 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.975768 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:08 crc kubenswrapper[4913]: I1001 12:39:08.991370 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:08Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.010178 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d5
8c53d70eeae47b04b4cfab4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:09Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.026854 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.026891 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.026903 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.026854 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:09Z is after 2025-08-24T17:21:41Z" Oct 01 
12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.026964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.026978 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.129975 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.130056 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.130094 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.130122 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.130142 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.232898 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.232980 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.233004 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.233035 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.233057 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.335166 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.335229 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.335248 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.335302 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.335320 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.440718 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.440793 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.440815 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.440846 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.440871 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.544726 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.544801 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.544825 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.544855 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.544878 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.647396 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.647453 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.647470 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.647495 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.647511 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.749796 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.749828 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.749852 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.749865 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.749873 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.805910 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.805928 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.805991 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:09 crc kubenswrapper[4913]: E1001 12:39:09.806069 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:09 crc kubenswrapper[4913]: E1001 12:39:09.806315 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:09 crc kubenswrapper[4913]: E1001 12:39:09.806349 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.852618 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.852671 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.852688 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.852710 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.852730 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.955577 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.955632 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.955648 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.955665 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:09 crc kubenswrapper[4913]: I1001 12:39:09.955677 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:09Z","lastTransitionTime":"2025-10-01T12:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.058160 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.058214 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.058230 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.058254 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.058300 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.160188 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.160222 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.160230 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.160245 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.160254 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.262684 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.262737 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.262751 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.262770 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.262785 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.366627 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.366665 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.366681 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.366697 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.366708 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.469328 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.469371 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.469382 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.469398 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.469410 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.571778 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.571834 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.571846 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.571861 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.571873 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.674586 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.674623 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.674634 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.674648 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.674658 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.777367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.777404 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.777414 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.777427 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.777437 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.805987 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:10 crc kubenswrapper[4913]: E1001 12:39:10.806147 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.807364 4913 scope.go:117] "RemoveContainer" containerID="a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.879576 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.879614 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.879625 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.879650 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.879662 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.982011 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.982048 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.982060 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.982076 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:10 crc kubenswrapper[4913]: I1001 12:39:10.982088 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:10Z","lastTransitionTime":"2025-10-01T12:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.083915 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.083953 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.083961 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.083974 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.083983 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.191346 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.191385 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.191395 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.191412 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.191422 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.269779 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/2.log" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.272419 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.273086 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.283876 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de8
7357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.293837 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.293875 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.293883 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.293897 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.293907 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.296130 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.307681 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.318146 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.329117 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.339883 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.357040 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.369879 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.393394 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090fa33aba0e8add115c1b0ce5c479aa9bb137d3
d4705625be1f980ecc6c8ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.396186 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.396244 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.396258 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.396308 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.396325 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.413995 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.428763 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.440542 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.453588 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.467650 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.483496 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.494996 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.498842 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.498888 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.498899 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.498919 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.498931 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.509093 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.601901 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.601933 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.601940 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.601973 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.601983 4913 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.705738 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.706057 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.706065 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.706081 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.706092 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.806572 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.806610 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.806698 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.806572 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.806942 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.806993 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.808112 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.808170 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.808455 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.808485 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.808505 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.910887 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.910964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.910988 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.911023 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.911050 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:11Z","lastTransitionTime":"2025-10-01T12:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.948080 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.948206 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.948246 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.948309 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:11 crc kubenswrapper[4913]: I1001 12:39:11.948331 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948383 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.948351465 +0000 UTC m=+147.851827073 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948436 4913 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948474 4913 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948486 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.948473638 +0000 UTC m=+147.851949216 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948508 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948708 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.948513659 +0000 UTC m=+147.851989267 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948730 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948756 4913 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948788 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948837 4913 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948845 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.948818518 +0000 UTC m=+147.852294176 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948861 4913 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:39:11 crc kubenswrapper[4913]: E1001 12:39:11.948932 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.948910891 +0000 UTC m=+147.852386539 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.014592 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.014658 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.014675 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.014699 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.014717 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.118131 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.118178 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.118192 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.118212 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.118225 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.221526 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.221593 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.221612 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.221639 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.221658 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.278737 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/3.log" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.280206 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/2.log" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.284379 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" exitCode=1 Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.284422 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.284460 4913 scope.go:117] "RemoveContainer" containerID="a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.287569 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:39:12 crc kubenswrapper[4913]: E1001 12:39:12.288264 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.306382 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.324052 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.324133 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.324146 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.324166 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.324180 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.330933 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.363168 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\
\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5cee4671cf90f9b6968171945586be3558bd1d58c53d70eeae47b04b4cfab4d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"message\\\":\\\" fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661639 6611 ovnkube.go:599] Stopped ovnkube\\\\nI1001 12:38:40.661643 6611 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:38:40.661686 6611 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 12:38:40.661748 6611 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:39:11Z\\\",\\\"message\\\":\\\" crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:39:11.634769 7040 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1001 12:39:11.634663 7040 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-operator-lifecycle-manager/packageserver-service]} name:Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:39:11.634800 7040 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:39:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\
"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.392771 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 
12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.415243 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.427003 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.427060 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.427078 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.427101 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.427118 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.436562 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.457460 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.472908 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z8555" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.491581 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.510621 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.528080 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.530221 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.530295 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.530313 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.530337 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.530354 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.545557 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f4
7d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.558513 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.571571 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.588416 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.604383 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.615195 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:39:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.632835 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.632932 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.632963 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.632998 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.633021 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.735957 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.736016 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.736029 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.736046 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.736060 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.806133 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:12 crc kubenswrapper[4913]: E1001 12:39:12.806325 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.839326 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.839364 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.839377 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.839395 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.839408 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.942061 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.942114 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.942130 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.942153 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:12 crc kubenswrapper[4913]: I1001 12:39:12.942169 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:12Z","lastTransitionTime":"2025-10-01T12:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.045166 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.045224 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.045241 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.045290 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.045308 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.148750 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.148810 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.148829 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.148855 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.148873 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.252083 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.252125 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.252133 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.252150 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.252159 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.290516 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/3.log" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.294760 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:39:13 crc kubenswrapper[4913]: E1001 12:39:13.295003 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.312314 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45553f35-c72e-4958-95ea-eb7eed4c742f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08aeac6119407218318ebc6ab207bf9b85b592d6a588a9dea2ba38e6d9ed7fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556741f4cc887b7c526046216bfccedeec0a4f62e97b24f4f29538e58e8f9704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4cb0a01560cd286f1a1eb344316f430de87357fc7019e4610f276ac0fcdcb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57645ca7626cac6d83cff2a6e0039afa5db4260e737c22eacab730d5e3ffbeee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.327656 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7e22f13b0c07b75d6a1e2027ad0f67dae5c2888d2a4738a050b53145d00eef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d5614ad3ece801d54d7214a99a2393dd60c62f3b3993b23e75ac0bb50622b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.348513 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.354162 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.354219 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.354241 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.354295 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.354317 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.361621 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t7565" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfb3767-c920-41f1-9c7b-88828a9a4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed23fa9e16e89a075e367ad30cbe5016fe57fdd056bbed21405335a73256b58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xhsm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t7565\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.372853 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8903e6e-381f-4f5c-b9c5-5242c3de2897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5175da7ade2fae53a54c3abb2d02c6df23fbf9334389b48fb4f1fc307a693e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-td6wp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8hltg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.386086 4913 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kfdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8c8wp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.402033 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqn52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2420adf-64bd-4d67-ac95-9337ed10149a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:56Z\\\",\\\"message\\\":\\\"2025-10-01T12:38:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9\\\\n2025-10-01T12:38:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_75738a46-e332-4bd0-b5b6-5560c3daeae9 to /host/opt/cni/bin/\\\\n2025-10-01T12:38:11Z [verbose] multus-daemon started\\\\n2025-10-01T12:38:11Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f76l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqn52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.413596 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.434240 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://090fa33aba0e8add115c1b0ce5c479aa9bb137d3
d4705625be1f980ecc6c8ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:39:11Z\\\",\\\"message\\\":\\\" crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:11Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:39:11.634769 7040 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1001 12:39:11.634663 7040 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/packageserver-service]} name:Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 12:39:11.634800 7040 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:39:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-57qvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.446225 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"687c12dc-5e8c-41f7-a962-c0b2cdd2a3cb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://702a7eb58ce8f3863609d59dde10bfc9f5029cc1a8367c8f5df4a2533e76db94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbcb11bc51d7e8bf137a9fffd96ab2ea99985abf3c9bef48611441b16ebead83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgd7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zbmtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.457627 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.457670 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.457682 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.457701 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.457713 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.460209 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0219f135-adcb-41cf-a30c-719dc8b8e8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a50602f46928140866d3695af83252c66cda13c6f3b89788bd50795668549f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c4dfdd83a1ef3b1b1d8fbe569a67e550ddceef1de8d2b45da1ba542658d44cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b28799a3973024a1be593d83e5e48e689898fc36d175f25dc3c4ca96a8f52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a4066b15b2046cb6e592e4ab215a27f83ca491957dbc55efd6fcd360e13ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4d5b73cc13eeaf33c329578a394afe5f9c12c58a7f8db5dce1315711447926a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8242c43fc44e1a2bf8fac0599a88e349807ff08017a803057e41999eca22edfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0067467bf661d660e2ddc91c0c61ec451c3eb00f7457ced919c167453dd971\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:38:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlshf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7v2m5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.473954 4913 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z8555" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23f43f91-4983-4348-926f-4dbcafbbaa18\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3539d98df5d2906c545a37c1dab102091da653cfc4fa7cbb31ab6d0b1d18f2ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7z6lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:38:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z8555\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.490107 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81cdc666-140e-4827-8034-5fb08c56d4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e964c44de670099c308c3c4ac4b2bcb896ab840521e4018515eebd5834ac0f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9743338a78bb99302706849f2ea0da8685ab830b4af737b8a45a97e66e26da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfb2f9e4f5edead2fec97db541ed0d291a890092cda0597506c35b58051e67d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d46a445af730179cb8c6d42200c86e26baea6677ac6a87b21fe44d0bc34e18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce5a948e31f6f4ef84c98b0b5ae9059a328803bbbe43cc9fd2d781f33c12125b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"message\\\":\\\"file observer\\\\nW1001 12:38:08.558293 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 12:38:08.558413 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:38:08.559062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-735312337/tls.crt::/tmp/serving-cert-735312337/tls.key\\\\\\\"\\\\nI1001 12:38:08.775879 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:38:08.778232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:38:08.778246 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:38:08.778260 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:38:08.778280 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:38:08.785215 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1001 12:38:08.785229 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:38:08.785241 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785248 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:38:08.785252 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:38:08.785255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:38:08.785258 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:38:08.785261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:38:08.787078 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b26e9a4b61874e73788eb70c84be05d1ff93c22b3ffab9c22adfc110d99fe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b070b02198871438cb7c66bd1bf3b667179bacff94369571570c131ae0e1d4c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.506241 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6e0ca3c-326b-45cc-884f-eed652a8fe83\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35607751ade7442d8d4416ce5768b60fa060cc0470cbf0304ae79db058958efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e3d9ef1a39554e6fc775dad4f9d085512ce5ab8ee48db370b0f383d5851df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2eddbd36455a6f647ac0ab79919312c2fd321482fb1b45be563de22b7def42e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://661f7f933617a3796090980e012ce85ba98120865a7c0207c418e8f09acb4f76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.521202 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2081ec2c0d190567ae4da7e966bbf22fea95cd0ba5bc67393712954526df3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.533754 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ead0af007b7e7422464c1b19358d624dac8c4310cffc9ec5f17d111fd6cc18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.547195 4913 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:39:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.559585 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.559623 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.559637 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.559657 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.559671 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.661879 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.661926 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.661944 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.661964 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.661981 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.765263 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.765339 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.765350 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.765367 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.765380 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.806530 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.806584 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.806607 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:13 crc kubenswrapper[4913]: E1001 12:39:13.806709 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:13 crc kubenswrapper[4913]: E1001 12:39:13.806800 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:13 crc kubenswrapper[4913]: E1001 12:39:13.806926 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.867943 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.867994 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.868012 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.868036 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.868055 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.970204 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.970242 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.970253 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.970302 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:13 crc kubenswrapper[4913]: I1001 12:39:13.970315 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:13Z","lastTransitionTime":"2025-10-01T12:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.072005 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.072048 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.072060 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.072076 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.072086 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.174665 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.174730 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.174739 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.174770 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.174781 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.277160 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.277225 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.277238 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.277299 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.277320 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.379972 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.380021 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.380032 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.380048 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.380058 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.482914 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.482960 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.482970 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.482988 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.482999 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.585255 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.585303 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.585312 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.585328 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.585338 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.687434 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.687471 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.687483 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.687499 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.687512 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.790084 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.790116 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.790124 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.790137 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.790147 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.806606 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:14 crc kubenswrapper[4913]: E1001 12:39:14.806768 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.892864 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.892918 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.892936 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.892961 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.893009 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.995463 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.995521 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.995530 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.995547 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:14 crc kubenswrapper[4913]: I1001 12:39:14.995565 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:14Z","lastTransitionTime":"2025-10-01T12:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.097664 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.097698 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.097708 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.097725 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.097736 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.199923 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.199972 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.199983 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.199996 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.200004 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.302086 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.302201 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.302212 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.302226 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.302237 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.404518 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.404578 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.404589 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.404607 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.404619 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.507223 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.507308 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.507334 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.507363 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.507425 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.609375 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.609429 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.609440 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.609457 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.609468 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.711436 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.711478 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.711486 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.711500 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.711508 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.806044 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.806058 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.806196 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:15 crc kubenswrapper[4913]: E1001 12:39:15.806353 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:15 crc kubenswrapper[4913]: E1001 12:39:15.806402 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:15 crc kubenswrapper[4913]: E1001 12:39:15.806485 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.813459 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.813501 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.813513 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.813527 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.813542 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.916161 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.916220 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.916235 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.916257 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:15 crc kubenswrapper[4913]: I1001 12:39:15.916296 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:15Z","lastTransitionTime":"2025-10-01T12:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.018494 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.018530 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.018541 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.018557 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.018566 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:16Z","lastTransitionTime":"2025-10-01T12:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.034406 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.034484 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.034543 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.034559 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.034573 4913 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:39:16Z","lastTransitionTime":"2025-10-01T12:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.084213 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm"] Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.084594 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.086732 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.088144 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.089023 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.089258 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.106448 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=63.106432179 podStartE2EDuration="1m3.106432179s" podCreationTimestamp="2025-10-01 12:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.106374038 +0000 UTC m=+88.009849626" watchObservedRunningTime="2025-10-01 12:39:16.106432179 +0000 UTC m=+88.009907757" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.167796 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7v2m5" podStartSLOduration=67.167777604 podStartE2EDuration="1m7.167777604s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.167731692 +0000 UTC m=+88.071207270" watchObservedRunningTime="2025-10-01 12:39:16.167777604 +0000 UTC m=+88.071253202" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.178242 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z8555" podStartSLOduration=67.178220003 podStartE2EDuration="1m7.178220003s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.17809813 +0000 UTC m=+88.081573708" watchObservedRunningTime="2025-10-01 12:39:16.178220003 +0000 UTC m=+88.081695581" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.190439 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d79742-08a7-4bad-8b0c-69d91520fb66-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.190523 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/39d79742-08a7-4bad-8b0c-69d91520fb66-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.190551 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d79742-08a7-4bad-8b0c-69d91520fb66-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.190580 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/39d79742-08a7-4bad-8b0c-69d91520fb66-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.190628 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d79742-08a7-4bad-8b0c-69d91520fb66-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.191835 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.191821941 podStartE2EDuration="1m8.191821941s" podCreationTimestamp="2025-10-01 12:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.191077661 +0000 UTC m=+88.094553259" watchObservedRunningTime="2025-10-01 12:39:16.191821941 +0000 UTC m=+88.095297529" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.211693 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t7565" podStartSLOduration=67.211673412 podStartE2EDuration="1m7.211673412s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.211531759 +0000 UTC m=+88.115007337" watchObservedRunningTime="2025-10-01 12:39:16.211673412 +0000 UTC m=+88.115148990" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.223316 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podStartSLOduration=67.223295825 podStartE2EDuration="1m7.223295825s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.222841053 +0000 UTC m=+88.126316891" watchObservedRunningTime="2025-10-01 12:39:16.223295825 +0000 UTC m=+88.126771403" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.265773 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.265755596 podStartE2EDuration="40.265755596s" podCreationTimestamp="2025-10-01 12:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.247098637 +0000 UTC m=+88.150574215" 
watchObservedRunningTime="2025-10-01 12:39:16.265755596 +0000 UTC m=+88.169231174" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.291984 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/39d79742-08a7-4bad-8b0c-69d91520fb66-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.292020 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d79742-08a7-4bad-8b0c-69d91520fb66-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.292036 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/39d79742-08a7-4bad-8b0c-69d91520fb66-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.292066 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d79742-08a7-4bad-8b0c-69d91520fb66-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.292098 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d79742-08a7-4bad-8b0c-69d91520fb66-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.292623 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/39d79742-08a7-4bad-8b0c-69d91520fb66-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.292969 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/39d79742-08a7-4bad-8b0c-69d91520fb66-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.293421 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39d79742-08a7-4bad-8b0c-69d91520fb66-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: 
I1001 12:39:16.299656 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d79742-08a7-4bad-8b0c-69d91520fb66-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.303337 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zqn52" podStartSLOduration=67.303326579 podStartE2EDuration="1m7.303326579s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.289381541 +0000 UTC m=+88.192857119" watchObservedRunningTime="2025-10-01 12:39:16.303326579 +0000 UTC m=+88.206802157" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.303539 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zbmtj" podStartSLOduration=67.303535565 podStartE2EDuration="1m7.303535565s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:16.303458132 +0000 UTC m=+88.206933720" watchObservedRunningTime="2025-10-01 12:39:16.303535565 +0000 UTC m=+88.207011143" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.314216 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39d79742-08a7-4bad-8b0c-69d91520fb66-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pbtmm\" (UID: \"39d79742-08a7-4bad-8b0c-69d91520fb66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.400665 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.805650 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:16 crc kubenswrapper[4913]: E1001 12:39:16.805775 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:16 crc kubenswrapper[4913]: I1001 12:39:16.821359 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.305340 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" event={"ID":"39d79742-08a7-4bad-8b0c-69d91520fb66","Type":"ContainerStarted","Data":"9d38caa65e2245ba48126470709fedc61fd53ec5ecb77674335f3c56cb4c8953"} Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.305426 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" event={"ID":"39d79742-08a7-4bad-8b0c-69d91520fb66","Type":"ContainerStarted","Data":"b6e9972cc1047245710ccfacb614e4a090bab6009ee0fea1e53b607c62062998"} Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.323885 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pbtmm" podStartSLOduration=68.32386671 podStartE2EDuration="1m8.32386671s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:17.322201234 +0000 UTC m=+89.225676842" watchObservedRunningTime="2025-10-01 12:39:17.32386671 +0000 UTC m=+89.227342288" Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.346875 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.346859278 podStartE2EDuration="1.346859278s" podCreationTimestamp="2025-10-01 12:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:17.345641435 +0000 UTC m=+89.249117023" watchObservedRunningTime="2025-10-01 12:39:17.346859278 +0000 UTC m=+89.250334866" Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.805821 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.805873 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:17 crc kubenswrapper[4913]: I1001 12:39:17.805936 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:17 crc kubenswrapper[4913]: E1001 12:39:17.806016 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:17 crc kubenswrapper[4913]: E1001 12:39:17.806082 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:17 crc kubenswrapper[4913]: E1001 12:39:17.806153 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:18 crc kubenswrapper[4913]: I1001 12:39:18.806702 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:18 crc kubenswrapper[4913]: E1001 12:39:18.806826 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:19 crc kubenswrapper[4913]: I1001 12:39:19.805841 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:19 crc kubenswrapper[4913]: I1001 12:39:19.805847 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:19 crc kubenswrapper[4913]: I1001 12:39:19.805841 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:19 crc kubenswrapper[4913]: E1001 12:39:19.806093 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:19 crc kubenswrapper[4913]: E1001 12:39:19.806180 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:19 crc kubenswrapper[4913]: E1001 12:39:19.805988 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:20 crc kubenswrapper[4913]: I1001 12:39:20.805724 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:20 crc kubenswrapper[4913]: E1001 12:39:20.805880 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:21 crc kubenswrapper[4913]: I1001 12:39:21.806235 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:21 crc kubenswrapper[4913]: E1001 12:39:21.806964 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:21 crc kubenswrapper[4913]: I1001 12:39:21.806439 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:21 crc kubenswrapper[4913]: E1001 12:39:21.807199 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:21 crc kubenswrapper[4913]: I1001 12:39:21.806258 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:21 crc kubenswrapper[4913]: E1001 12:39:21.807475 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:22 crc kubenswrapper[4913]: I1001 12:39:22.806459 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:22 crc kubenswrapper[4913]: E1001 12:39:22.806640 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:22 crc kubenswrapper[4913]: I1001 12:39:22.819230 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 12:39:23 crc kubenswrapper[4913]: I1001 12:39:23.806259 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:23 crc kubenswrapper[4913]: I1001 12:39:23.806450 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:23 crc kubenswrapper[4913]: I1001 12:39:23.806461 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:23 crc kubenswrapper[4913]: E1001 12:39:23.806581 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:23 crc kubenswrapper[4913]: E1001 12:39:23.806715 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:23 crc kubenswrapper[4913]: E1001 12:39:23.807086 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:23 crc kubenswrapper[4913]: I1001 12:39:23.807380 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:39:23 crc kubenswrapper[4913]: E1001 12:39:23.807636 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:39:24 crc kubenswrapper[4913]: I1001 12:39:24.806154 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:24 crc kubenswrapper[4913]: E1001 12:39:24.806306 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:25 crc kubenswrapper[4913]: I1001 12:39:25.805935 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:25 crc kubenswrapper[4913]: I1001 12:39:25.805961 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:25 crc kubenswrapper[4913]: I1001 12:39:25.805960 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:25 crc kubenswrapper[4913]: E1001 12:39:25.806093 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:25 crc kubenswrapper[4913]: E1001 12:39:25.806153 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:25 crc kubenswrapper[4913]: E1001 12:39:25.806289 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:26 crc kubenswrapper[4913]: I1001 12:39:26.805918 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:26 crc kubenswrapper[4913]: E1001 12:39:26.806049 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:27 crc kubenswrapper[4913]: I1001 12:39:27.701538 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:27 crc kubenswrapper[4913]: E1001 12:39:27.701820 4913 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:39:27 crc kubenswrapper[4913]: E1001 12:39:27.701923 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs podName:c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18 nodeName:}" failed. No retries permitted until 2025-10-01 12:40:31.701895689 +0000 UTC m=+163.605371377 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs") pod "network-metrics-daemon-8c8wp" (UID: "c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:39:27 crc kubenswrapper[4913]: I1001 12:39:27.806410 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:27 crc kubenswrapper[4913]: I1001 12:39:27.806456 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:27 crc kubenswrapper[4913]: I1001 12:39:27.806419 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:27 crc kubenswrapper[4913]: E1001 12:39:27.806747 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:27 crc kubenswrapper[4913]: E1001 12:39:27.806912 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:27 crc kubenswrapper[4913]: E1001 12:39:27.807079 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:28 crc kubenswrapper[4913]: I1001 12:39:28.831621 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:28 crc kubenswrapper[4913]: E1001 12:39:28.833719 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:28 crc kubenswrapper[4913]: I1001 12:39:28.845448 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.845430087 podStartE2EDuration="6.845430087s" podCreationTimestamp="2025-10-01 12:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.845198421 +0000 UTC m=+100.748674049" watchObservedRunningTime="2025-10-01 12:39:28.845430087 +0000 UTC m=+100.748905675" Oct 01 12:39:29 crc kubenswrapper[4913]: I1001 12:39:29.806425 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:29 crc kubenswrapper[4913]: I1001 12:39:29.806425 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:29 crc kubenswrapper[4913]: E1001 12:39:29.806927 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:29 crc kubenswrapper[4913]: I1001 12:39:29.806511 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:29 crc kubenswrapper[4913]: E1001 12:39:29.807015 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:29 crc kubenswrapper[4913]: E1001 12:39:29.807223 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:30 crc kubenswrapper[4913]: I1001 12:39:30.806348 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:30 crc kubenswrapper[4913]: E1001 12:39:30.806516 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:31 crc kubenswrapper[4913]: I1001 12:39:31.806301 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:31 crc kubenswrapper[4913]: I1001 12:39:31.806379 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:31 crc kubenswrapper[4913]: I1001 12:39:31.806302 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:31 crc kubenswrapper[4913]: E1001 12:39:31.806495 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:31 crc kubenswrapper[4913]: E1001 12:39:31.806640 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:31 crc kubenswrapper[4913]: E1001 12:39:31.806791 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:32 crc kubenswrapper[4913]: I1001 12:39:32.805857 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:32 crc kubenswrapper[4913]: E1001 12:39:32.806067 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:33 crc kubenswrapper[4913]: I1001 12:39:33.805665 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:33 crc kubenswrapper[4913]: E1001 12:39:33.805886 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:33 crc kubenswrapper[4913]: I1001 12:39:33.806227 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:33 crc kubenswrapper[4913]: E1001 12:39:33.806375 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:33 crc kubenswrapper[4913]: I1001 12:39:33.806460 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:33 crc kubenswrapper[4913]: E1001 12:39:33.806553 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:34 crc kubenswrapper[4913]: I1001 12:39:34.806396 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:34 crc kubenswrapper[4913]: E1001 12:39:34.806568 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:34 crc kubenswrapper[4913]: I1001 12:39:34.807218 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:39:34 crc kubenswrapper[4913]: E1001 12:39:34.807380 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:39:35 crc kubenswrapper[4913]: I1001 12:39:35.806123 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:35 crc kubenswrapper[4913]: I1001 12:39:35.806193 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:35 crc kubenswrapper[4913]: E1001 12:39:35.806232 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:35 crc kubenswrapper[4913]: E1001 12:39:35.806316 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:35 crc kubenswrapper[4913]: I1001 12:39:35.806371 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:35 crc kubenswrapper[4913]: E1001 12:39:35.806589 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:36 crc kubenswrapper[4913]: I1001 12:39:36.806156 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:36 crc kubenswrapper[4913]: E1001 12:39:36.806381 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:37 crc kubenswrapper[4913]: I1001 12:39:37.805869 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:37 crc kubenswrapper[4913]: I1001 12:39:37.805940 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:37 crc kubenswrapper[4913]: E1001 12:39:37.806013 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:37 crc kubenswrapper[4913]: I1001 12:39:37.805964 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:37 crc kubenswrapper[4913]: E1001 12:39:37.806135 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:37 crc kubenswrapper[4913]: E1001 12:39:37.806238 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:38 crc kubenswrapper[4913]: I1001 12:39:38.805871 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:38 crc kubenswrapper[4913]: E1001 12:39:38.806783 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:39 crc kubenswrapper[4913]: I1001 12:39:39.805666 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:39 crc kubenswrapper[4913]: I1001 12:39:39.805742 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:39 crc kubenswrapper[4913]: I1001 12:39:39.805796 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:39 crc kubenswrapper[4913]: E1001 12:39:39.805855 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:39 crc kubenswrapper[4913]: E1001 12:39:39.806031 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:39 crc kubenswrapper[4913]: E1001 12:39:39.806216 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:40 crc kubenswrapper[4913]: I1001 12:39:40.805979 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:40 crc kubenswrapper[4913]: E1001 12:39:40.806106 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:41 crc kubenswrapper[4913]: I1001 12:39:41.805695 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:41 crc kubenswrapper[4913]: I1001 12:39:41.805772 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:41 crc kubenswrapper[4913]: E1001 12:39:41.805879 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:41 crc kubenswrapper[4913]: I1001 12:39:41.805718 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:41 crc kubenswrapper[4913]: E1001 12:39:41.806000 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:41 crc kubenswrapper[4913]: E1001 12:39:41.806186 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:42 crc kubenswrapper[4913]: I1001 12:39:42.806548 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:42 crc kubenswrapper[4913]: E1001 12:39:42.806890 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.390410 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/1.log" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.390808 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/0.log" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.390840 4913 generic.go:334] "Generic (PLEG): container finished" podID="b2420adf-64bd-4d67-ac95-9337ed10149a" containerID="4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9" exitCode=1 Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.390866 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerDied","Data":"4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9"} Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.390896 4913 scope.go:117] "RemoveContainer" containerID="5afdedaa9bfaf0ed9befad3fc7a2b637f049c3cba1c44995a25a015b0043a7d0" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.392810 4913 scope.go:117] "RemoveContainer" containerID="4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9" Oct 01 12:39:43 crc kubenswrapper[4913]: E1001 12:39:43.393126 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zqn52_openshift-multus(b2420adf-64bd-4d67-ac95-9337ed10149a)\"" pod="openshift-multus/multus-zqn52" podUID="b2420adf-64bd-4d67-ac95-9337ed10149a" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.806192 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:43 crc kubenswrapper[4913]: E1001 12:39:43.806578 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.806299 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:43 crc kubenswrapper[4913]: I1001 12:39:43.806197 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:43 crc kubenswrapper[4913]: E1001 12:39:43.806668 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:43 crc kubenswrapper[4913]: E1001 12:39:43.806994 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:44 crc kubenswrapper[4913]: I1001 12:39:44.398380 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/1.log" Oct 01 12:39:44 crc kubenswrapper[4913]: I1001 12:39:44.806531 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:44 crc kubenswrapper[4913]: E1001 12:39:44.806724 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:45 crc kubenswrapper[4913]: I1001 12:39:45.806192 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:45 crc kubenswrapper[4913]: I1001 12:39:45.806299 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:45 crc kubenswrapper[4913]: E1001 12:39:45.806378 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:45 crc kubenswrapper[4913]: I1001 12:39:45.806402 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:45 crc kubenswrapper[4913]: E1001 12:39:45.806542 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:45 crc kubenswrapper[4913]: E1001 12:39:45.806616 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:46 crc kubenswrapper[4913]: I1001 12:39:46.805785 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:46 crc kubenswrapper[4913]: E1001 12:39:46.805967 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:47 crc kubenswrapper[4913]: I1001 12:39:47.805844 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:47 crc kubenswrapper[4913]: I1001 12:39:47.805973 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:47 crc kubenswrapper[4913]: I1001 12:39:47.806090 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:47 crc kubenswrapper[4913]: E1001 12:39:47.805996 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:47 crc kubenswrapper[4913]: E1001 12:39:47.806203 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:47 crc kubenswrapper[4913]: E1001 12:39:47.806388 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:48 crc kubenswrapper[4913]: E1001 12:39:48.796349 4913 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 12:39:48 crc kubenswrapper[4913]: I1001 12:39:48.806022 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:48 crc kubenswrapper[4913]: E1001 12:39:48.815348 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:48 crc kubenswrapper[4913]: I1001 12:39:48.816753 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:39:48 crc kubenswrapper[4913]: E1001 12:39:48.817092 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-57qvb_openshift-ovn-kubernetes(c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" Oct 01 12:39:48 crc kubenswrapper[4913]: E1001 12:39:48.916258 4913 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:39:49 crc kubenswrapper[4913]: I1001 12:39:49.806372 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:49 crc kubenswrapper[4913]: I1001 12:39:49.806446 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:49 crc kubenswrapper[4913]: E1001 12:39:49.806525 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:49 crc kubenswrapper[4913]: E1001 12:39:49.806663 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
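
The kubelet_node_status.go:497 and kubelet.go:2916 entries above are the node-level symptoms of the same condition the per-pod errors keep repeating: the runtime status poll reports NetworkReady=false because no CNI config exists yet in /etc/kubernetes/cni/net.d/, so the node cannot go Ready. On this cluster that config is written once the multus/OVN pods recover, which is what the NodeReady event at 12:40:07 further down confirms. A quick check of the same condition from the node, as a sketch (the extension filter assumes the usual libcni defaults of .conf/.conflist/.json):

    import os, sys

    # Report whether the directory named in the kubelet errors contains a
    # loadable CNI network config yet.
    CNI_DIR = "/etc/kubernetes/cni/net.d"   # path taken from the log lines

    def cni_config_present(d=CNI_DIR):
        try:
            return any(f.endswith((".conf", ".conflist", ".json"))
                       for f in os.listdir(d))
        except FileNotFoundError:
            return False

    sys.exit(0 if cni_config_present() else 1)
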
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:49 crc kubenswrapper[4913]: E1001 12:39:49.806953 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:50 crc kubenswrapper[4913]: I1001 12:39:50.806132 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:50 crc kubenswrapper[4913]: E1001 12:39:50.806306 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:51 crc kubenswrapper[4913]: I1001 12:39:51.806105 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:51 crc kubenswrapper[4913]: I1001 12:39:51.806174 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:51 crc kubenswrapper[4913]: E1001 12:39:51.806403 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:51 crc kubenswrapper[4913]: I1001 12:39:51.806683 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:51 crc kubenswrapper[4913]: E1001 12:39:51.806820 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:51 crc kubenswrapper[4913]: E1001 12:39:51.807708 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:52 crc kubenswrapper[4913]: I1001 12:39:52.806862 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:52 crc kubenswrapper[4913]: E1001 12:39:52.807047 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:53 crc kubenswrapper[4913]: I1001 12:39:53.806222 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:53 crc kubenswrapper[4913]: I1001 12:39:53.806315 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:53 crc kubenswrapper[4913]: E1001 12:39:53.806458 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:53 crc kubenswrapper[4913]: I1001 12:39:53.806620 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:53 crc kubenswrapper[4913]: E1001 12:39:53.806816 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:53 crc kubenswrapper[4913]: E1001 12:39:53.806920 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:53 crc kubenswrapper[4913]: E1001 12:39:53.917781 4913 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:39:54 crc kubenswrapper[4913]: I1001 12:39:54.805953 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:54 crc kubenswrapper[4913]: E1001 12:39:54.806136 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:55 crc kubenswrapper[4913]: I1001 12:39:55.805988 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:55 crc kubenswrapper[4913]: I1001 12:39:55.806067 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:55 crc kubenswrapper[4913]: E1001 12:39:55.806114 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:55 crc kubenswrapper[4913]: I1001 12:39:55.806142 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:55 crc kubenswrapper[4913]: E1001 12:39:55.806348 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:55 crc kubenswrapper[4913]: E1001 12:39:55.806442 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:56 crc kubenswrapper[4913]: I1001 12:39:56.806579 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:56 crc kubenswrapper[4913]: E1001 12:39:56.806761 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:57 crc kubenswrapper[4913]: I1001 12:39:57.806101 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:57 crc kubenswrapper[4913]: E1001 12:39:57.806316 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:57 crc kubenswrapper[4913]: I1001 12:39:57.806374 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:57 crc kubenswrapper[4913]: E1001 12:39:57.806775 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:57 crc kubenswrapper[4913]: I1001 12:39:57.806860 4913 scope.go:117] "RemoveContainer" containerID="4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9" Oct 01 12:39:57 crc kubenswrapper[4913]: I1001 12:39:57.807407 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:57 crc kubenswrapper[4913]: E1001 12:39:57.807570 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:58 crc kubenswrapper[4913]: I1001 12:39:58.443598 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/1.log" Oct 01 12:39:58 crc kubenswrapper[4913]: I1001 12:39:58.443935 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerStarted","Data":"993223123ddc31a9f0505ba888d1ff1a8341a371ee37bc495e687819f372d309"} Oct 01 12:39:58 crc kubenswrapper[4913]: I1001 12:39:58.806654 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:58 crc kubenswrapper[4913]: E1001 12:39:58.807496 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:58 crc kubenswrapper[4913]: E1001 12:39:58.920404 4913 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:39:59 crc kubenswrapper[4913]: I1001 12:39:59.806122 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:59 crc kubenswrapper[4913]: I1001 12:39:59.806216 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:39:59 crc kubenswrapper[4913]: I1001 12:39:59.806216 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:59 crc kubenswrapper[4913]: E1001 12:39:59.806423 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:59 crc kubenswrapper[4913]: E1001 12:39:59.806540 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:39:59 crc kubenswrapper[4913]: E1001 12:39:59.806678 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:40:00 crc kubenswrapper[4913]: I1001 12:40:00.806755 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:00 crc kubenswrapper[4913]: E1001 12:40:00.806946 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:40:00 crc kubenswrapper[4913]: I1001 12:40:00.808180 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.455046 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/3.log" Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.457163 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerStarted","Data":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.458051 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.573703 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podStartSLOduration=112.573685355 podStartE2EDuration="1m52.573685355s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:01.488446241 +0000 UTC m=+133.391921839" watchObservedRunningTime="2025-10-01 12:40:01.573685355 +0000 UTC m=+133.477160933" Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.574313 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8c8wp"] Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.574401 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:40:01 crc kubenswrapper[4913]: E1001 12:40:01.574475 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.805646 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:01 crc kubenswrapper[4913]: I1001 12:40:01.805623 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
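
The pod_startup_latency_tracker entry above records ovnkube-node-57qvb finally running 112.57s after creation. The arithmetic is visible in the entry itself: podStartSLOduration is watchObservedRunningTime (12:40:01.573685355) minus podCreationTimestamp (12:38:09), and because no images had to be pulled (both pull timestamps are the zero time 0001-01-01), the SLO duration equals the E2E duration of 1m52.573685355s. Reproduced directly, as a small check:

    from datetime import datetime

    # podStartSLOduration = observed-running time minus pod creation time;
    # with zero-time pull timestamps it equals podStartE2EDuration.
    fmt = "%Y-%m-%d %H:%M:%S.%f %z"
    created  = datetime.strptime("2025-10-01 12:38:09.000000 +0000", fmt)
    observed = datetime.strptime("2025-10-01 12:40:01.573685 +0000", fmt)
    print((observed - created).total_seconds())   # 112.573685 -> "1m52.57s"
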
Oct 01 12:40:01 crc kubenswrapper[4913]: E1001 12:40:01.805812 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:40:01 crc kubenswrapper[4913]: E1001 12:40:01.805983 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:40:02 crc kubenswrapper[4913]: I1001 12:40:02.806581 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:02 crc kubenswrapper[4913]: E1001 12:40:02.806915 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:40:03 crc kubenswrapper[4913]: I1001 12:40:03.806432 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:40:03 crc kubenswrapper[4913]: I1001 12:40:03.806526 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:03 crc kubenswrapper[4913]: I1001 12:40:03.806455 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:40:03 crc kubenswrapper[4913]: E1001 12:40:03.806595 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8c8wp" podUID="c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18" Oct 01 12:40:03 crc kubenswrapper[4913]: E1001 12:40:03.806720 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:40:03 crc kubenswrapper[4913]: E1001 12:40:03.806838 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:40:04 crc kubenswrapper[4913]: I1001 12:40:04.806686 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:04 crc kubenswrapper[4913]: I1001 12:40:04.809194 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 12:40:04 crc kubenswrapper[4913]: I1001 12:40:04.809224 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.806250 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.806343 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.806386 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.810017 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.810040 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.810158 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 12:40:05 crc kubenswrapper[4913]: I1001 12:40:05.810314 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.270028 4913 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.341652 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.341994 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.345416 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.348195 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wp7l9"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.348866 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.348902 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.352282 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.352966 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.353559 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.353674 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.354412 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vwnnv"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.354736 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.355257 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.360069 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.360237 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.360437 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.360556 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.360989 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qjc8c"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.361435 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.373882 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.373985 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.375327 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.376256 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.376671 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.376730 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.376673 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.377304 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.377533 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.378093 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.378227 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.378301 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.378472 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379771 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379842 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379930 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379941 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379973 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380029 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379949 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380029 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.379985 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380045 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380099 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380339 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380417 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380421 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380575 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.380696 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.385196 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.388309 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.389781 4913 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.390040 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.390954 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.391001 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.391483 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.391499 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.391854 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.403118 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.405975 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.406329 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.406398 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.406648 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.412034 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-97mb9"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.412687 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.413593 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ssczm"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.413949 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.415139 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.415476 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.418188 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.418447 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.418852 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lhq4g"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.419079 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rswcz"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.419417 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.419943 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.420077 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.420519 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.421203 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.421460 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.421813 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.422245 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.423394 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.424749 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4fjzp"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.425055 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.426858 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.427208 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.427404 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.427775 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.427927 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.428429 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.428593 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.429584 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.433388 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.433888 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.435138 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.446696 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.448380 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p2n7t"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.448849 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.448945 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449024 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449093 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-client-ca\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449178 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-serving-cert\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449253 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8txcb\" (UniqueName: \"kubernetes.io/projected/e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3-kube-api-access-8txcb\") pod \"cluster-samples-operator-665b6dd947-5vwx6\" (UID: \"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-image-import-ca\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449438 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcmg\" (UniqueName: \"kubernetes.io/projected/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-kube-api-access-xrcmg\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449512 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5vwx6\" (UID: \"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449592 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-config\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449667 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449746 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-config\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449813 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449881 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbsd\" (UniqueName: \"kubernetes.io/projected/f5cde242-eb9a-4376-b458-5054299e53e0-kube-api-access-tsbsd\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449949 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-policies\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450191 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxsw\" (UniqueName: \"kubernetes.io/projected/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-kube-api-access-4mxsw\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450279 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-node-pullsecrets\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450357 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7d7281-96c5-4a7d-b57f-769fbafba858-serving-cert\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450425 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzqn\" (UniqueName: \"kubernetes.io/projected/bb7d7281-96c5-4a7d-b57f-769fbafba858-kube-api-access-jvzqn\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450496 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450581 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450661 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450731 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450803 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-encryption-config\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450873 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-audit-dir\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.450942 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-audit\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451011 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5cde242-eb9a-4376-b458-5054299e53e0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451080 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451151 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451225 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-client-ca\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451316 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451424 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.449147 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451515 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-dir\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451765 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-config\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451838 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-etcd-client\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451901 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.451978 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-etcd-serving-ca\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.452047 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5cde242-eb9a-4376-b458-5054299e53e0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.452116 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwgf\" (UniqueName: \"kubernetes.io/projected/c349f466-f6f2-44a8-aea1-090f74dd7abe-kube-api-access-ppwgf\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.457945 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4fts"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.458470 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.477222 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.477647 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.479834 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.482407 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.491602 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.491903 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.492282 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.492569 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.492965 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.493516 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.493851 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.493979 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.494144 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.495945 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496415 
4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497355 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497362 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497516 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497596 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497769 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496501 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496561 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496717 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496752 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496804 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496932 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497828 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497849 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497897 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497913 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497099 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497017 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.498048 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.498052 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497034 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497037 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.497292 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.498435 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.498540 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.498563 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.496987 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.498881 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbrpv"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.499427 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.503750 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.508122 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.508455 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.509435 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.509535 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.509807 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.518422 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.519459 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.519665 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.522119 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.522782 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.501902 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.522991 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.523582 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.524360 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.524469 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.524571 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7f575"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.525160 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kssl9"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.525396 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.525653 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.534921 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.535623 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.540936 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.541203 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.541263 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.542774 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.543022 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.543745 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.544798 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.545050 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.547253 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.547907 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.548218 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.549542 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jk8wn"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.549875 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg"] Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.550123 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.551197 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552595 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552657 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-etcd-client\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552677 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552722 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-config\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552737 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-etcd-serving-ca\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552756 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5cde242-eb9a-4376-b458-5054299e53e0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552796 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwgf\" (UniqueName: \"kubernetes.io/projected/c349f466-f6f2-44a8-aea1-090f74dd7abe-kube-api-access-ppwgf\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552813 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552837 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552878 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552901 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-client-ca\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552916 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-serving-cert\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552951 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-image-import-ca\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552968 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrcmg\" (UniqueName: \"kubernetes.io/projected/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-kube-api-access-xrcmg\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.552984 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8txcb\" (UniqueName: \"kubernetes.io/projected/e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3-kube-api-access-8txcb\") pod \"cluster-samples-operator-665b6dd947-5vwx6\" (UID: \"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553000 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5vwx6\" (UID: \"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553036 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-config\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 
12:40:07.553053 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553071 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-config\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553107 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553131 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbsd\" (UniqueName: \"kubernetes.io/projected/f5cde242-eb9a-4376-b458-5054299e53e0-kube-api-access-tsbsd\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553150 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-policies\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553193 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxsw\" (UniqueName: \"kubernetes.io/projected/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-kube-api-access-4mxsw\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553211 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzqn\" (UniqueName: \"kubernetes.io/projected/bb7d7281-96c5-4a7d-b57f-769fbafba858-kube-api-access-jvzqn\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553226 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553260 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-node-pullsecrets\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553287 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7d7281-96c5-4a7d-b57f-769fbafba858-serving-cert\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553305 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553342 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553358 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553375 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-encryption-config\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553390 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-audit-dir\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553428 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553444 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-audit\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553459 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5cde242-eb9a-4376-b458-5054299e53e0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553472 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553517 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-client-ca\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553535 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553552 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553587 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-dir\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553635 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-dir\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.553690 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-config\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.554369 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-policies\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.554413 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.554560 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-audit-dir\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.554627 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-node-pullsecrets\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.555184 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.557314 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.557740 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.557863 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-audit\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.558079 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5cde242-eb9a-4376-b458-5054299e53e0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.557861 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-etcd-serving-ca\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.559163 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-image-import-ca\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.559505 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-etcd-client\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.559765 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-config\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.559779 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.559977 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-encryption-config\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.561033 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-client-ca\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.562593 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrp62"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.563060 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.563255 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.563678 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7d7281-96c5-4a7d-b57f-769fbafba858-serving-cert\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.563843 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.573607 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.573855 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.574042 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.574491 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.574793 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.574814 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.574847 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575093 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575308 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5vwx6\" (UID: \"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575531 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575760 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vwnnv"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575776 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575786 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575797 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575814 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.575901 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.577756 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4fts"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.580518 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.583950 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lhq4g"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.589888 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qjc8c"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.590623 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.590742 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-serving-cert\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.590482 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-client-ca\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.591100 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-config\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.591783 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.594816 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.600754 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5cde242-eb9a-4376-b458-5054299e53e0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.600949 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.603017 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.603877 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.606202 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.607470 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.609398 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.610480 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wp7l9"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.612313 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tjv45"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.612902 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjv45"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.613106 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.614493 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4fjzp"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.615754 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ssczm"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.616618 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-97mb9"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.617664 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7f575"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.618865 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.619520 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.621167 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rswcz"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.623174 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.623900 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.624506 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.625810 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbrpv"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.626888 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.628447 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kssl9"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.629741 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrp62"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.630747 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.632004 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p2n7t"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.633242 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.634229 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tjv45"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.635836 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.637201 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.638742 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-78qn2"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.639961 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-f5nlk"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.640524 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-78qn2"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.640655 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f5nlk"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.640766 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-78qn2"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.641791 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vqbl5"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.642836 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vqbl5"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.642897 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vqbl5"]
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.643786 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.663805 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.683927 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.703847 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.723510 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.743537 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.763792 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.784002 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.804709 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.823461 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.844225 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.863719 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.883512 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.903196 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.924307 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.944129 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.964412 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 01 12:40:07 crc kubenswrapper[4913]: I1001 12:40:07.984252 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.004544 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.038753 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.044533 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.065231 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.085369 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.105599 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.125124 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.144765 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.204020 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.225199 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.245343 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.264876 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.284295 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.303974 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.325100 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.344103 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.364689 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.385096 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.404098 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.435716 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.444015 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.465182 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.484108 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.505317 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.522875 4913 request.go:700] Waited for 1.00079023s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dkube-scheduler-operator-serving-cert&limit=500&resourceVersion=0
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.525166 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.544698 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.563797 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.584159 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.604327 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.624410 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.644223 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.671870 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.684048 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.704397 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.724489 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.743510 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.764422 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.786476 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.804779 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.825046 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.843991 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.865099 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.884475 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.903975 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.924765 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.944063 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.964882 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 01 12:40:08 crc kubenswrapper[4913]: I1001 12:40:08.984944 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.025547 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbsd\" (UniqueName: \"kubernetes.io/projected/f5cde242-eb9a-4376-b458-5054299e53e0-kube-api-access-tsbsd\") pod \"openshift-apiserver-operator-796bbdcf4f-hxvqt\" (UID: \"f5cde242-eb9a-4376-b458-5054299e53e0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.039154 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mxsw\" (UniqueName: \"kubernetes.io/projected/60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e-kube-api-access-4mxsw\") pod \"apiserver-76f77b778f-wp7l9\" (UID: \"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e\") " pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.062887 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzqn\" (UniqueName: \"kubernetes.io/projected/bb7d7281-96c5-4a7d-b57f-769fbafba858-kube-api-access-jvzqn\") pod \"controller-manager-879f6c89f-vwnnv\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.077860 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8txcb\" (UniqueName: \"kubernetes.io/projected/e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3-kube-api-access-8txcb\") pod \"cluster-samples-operator-665b6dd947-5vwx6\" (UID: \"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.098436 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrcmg\" (UniqueName: \"kubernetes.io/projected/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-kube-api-access-xrcmg\") pod \"route-controller-manager-6576b87f9c-mmgph\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.121461 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppwgf\" (UniqueName: \"kubernetes.io/projected/c349f466-f6f2-44a8-aea1-090f74dd7abe-kube-api-access-ppwgf\") pod \"oauth-openshift-558db77b4-qjc8c\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.124102 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.144345 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.160309 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.164901 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.171476 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.184890 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.186060 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.204627 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.209608 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.217897 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.224343 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.230693 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.244585 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.264617 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.284735 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.304257 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.324776 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.346688 4913 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.363935 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.373611 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"]
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.385170 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.388876 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6"]
Oct 01 12:40:09 crc kubenswrapper[4913]: W1001 12:40:09.390149 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3cf64b_1ff7_4792_94fa_4400e85bc1d6.slice/crio-3324d0a7aec959692c76495f067854eb2e0273416856bc1b53619d452d5cc3b0 WatchSource:0}: Error finding container 3324d0a7aec959692c76495f067854eb2e0273416856bc1b53619d452d5cc3b0: Status 404 returned error can't find the container with id 3324d0a7aec959692c76495f067854eb2e0273416856bc1b53619d452d5cc3b0
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.404652 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.420978 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wp7l9"]
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.424186 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.443667 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.465223 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.483700 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.504023 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.506874 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" event={"ID":"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e","Type":"ContainerStarted","Data":"806cf4c9f72a78e040e33bf790c1a447e4dc5df98a3d4641000cc02ffd47f903"}
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.508649 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" event={"ID":"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3","Type":"ContainerStarted","Data":"b0663d80355cf119154319a241f7e5ddeca0066bcad22001b893517628c357b3"}
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.510447 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" event={"ID":"db3cf64b-1ff7-4792-94fa-4400e85bc1d6","Type":"ContainerStarted","Data":"6de81adf1d32a551c07256d9bb6afc1dfe17c2edc6aea8303f53f43fbc2786a6"}
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.510496 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" event={"ID":"db3cf64b-1ff7-4792-94fa-4400e85bc1d6","Type":"ContainerStarted","Data":"3324d0a7aec959692c76495f067854eb2e0273416856bc1b53619d452d5cc3b0"}
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.511409 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.515178 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qjc8c"]
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.516530 4913 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mmgph container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.516598 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" podUID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 01 12:40:09 crc kubenswrapper[4913]: W1001 12:40:09.523133 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc349f466_f6f2_44a8_aea1_090f74dd7abe.slice/crio-d0cb582dd46430ff2da43d850435f9e5dbe11801279118e90c12c5f24c79f5c3 WatchSource:0}: Error finding container d0cb582dd46430ff2da43d850435f9e5dbe11801279118e90c12c5f24c79f5c3: Status 404 returned error can't find the container with id d0cb582dd46430ff2da43d850435f9e5dbe11801279118e90c12c5f24c79f5c3
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.542647 4913 request.go:700] Waited for 1.35511107s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-zcf4q/status
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574048 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-config\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574100 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6fd5584-6878-4be8-83bb-f61003df2639-config-volume\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574126 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmvq\" (UniqueName: \"kubernetes.io/projected/eb64fb3e-2ed2-469a-8278-13be858098a1-kube-api-access-lhmvq\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574156 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8c97\" (UniqueName: \"kubernetes.io/projected/92b2bff6-3b61-4d3b-8d88-9077b02ed990-kube-api-access-g8c97\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574177 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-ca\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574193 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dff82d10-6428-408e-be1f-15df477faac8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574210 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddzm\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-kube-api-access-xddzm\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574237 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3cfbc937-568e-42d3-9d14-6dec309b3eed-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574256 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574287 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/088f5c34-691a-4adb-95f8-46052ba7241a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574306 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95688848-6ae6-4abe-a83a-0b43899c2b81-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574322 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp965\" (UniqueName: \"kubernetes.io/projected/4d2bd20a-3d8d-4073-aca4-ceca547c186f-kube-api-access-gp965\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574340 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-certificates\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574360 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/619c15e7-db7d-4cbb-af79-5e52468bfc1a-auth-proxy-config\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574381 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-oauth-config\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574415 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-config\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574432 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-serving-cert\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574455 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-oauth-serving-cert\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574473 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b2bff6-3b61-4d3b-8d88-9077b02ed990-config\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574488 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv98j\" (UniqueName: \"kubernetes.io/projected/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-kube-api-access-pv98j\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574504 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krkxs\" (UniqueName: \"kubernetes.io/projected/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-kube-api-access-krkxs\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574522 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-audit-policies\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574538 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-client\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574554 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frqc\" (UniqueName: \"kubernetes.io/projected/3cfbc937-568e-42d3-9d14-6dec309b3eed-kube-api-access-6frqc\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574572 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-serving-cert\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574598 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb64fb3e-2ed2-469a-8278-13be858098a1-serving-cert\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574621 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dff82d10-6428-408e-be1f-15df477faac8-metrics-tls\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574641 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a38ca0f1-cbc5-42ab-b174-a7affc65a898-serving-cert\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574675 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92b2bff6-3b61-4d3b-8d88-9077b02ed990-images\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574744 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/92b2bff6-3b61-4d3b-8d88-9077b02ed990-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574790 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txk79\" (UniqueName: \"kubernetes.io/projected/dff82d10-6428-408e-be1f-15df477faac8-kube-api-access-txk79\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID:
\"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574861 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wmr\" (UniqueName: \"kubernetes.io/projected/a38ca0f1-cbc5-42ab-b174-a7affc65a898-kube-api-access-26wmr\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574891 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6fd5584-6878-4be8-83bb-f61003df2639-secret-volume\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.574975 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd1e4468-40ba-4a47-8f89-99de7fec4071-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575037 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575065 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh24b\" (UniqueName: \"kubernetes.io/projected/9639b151-8302-489d-bfc4-dc8d2e371363-kube-api-access-mh24b\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575084 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619c15e7-db7d-4cbb-af79-5e52468bfc1a-config\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575106 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-config\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575128 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95688848-6ae6-4abe-a83a-0b43899c2b81-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575152 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26rbc\" (UniqueName: \"kubernetes.io/projected/088f5c34-691a-4adb-95f8-46052ba7241a-kube-api-access-26rbc\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575198 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cfbc937-568e-42d3-9d14-6dec309b3eed-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575253 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-config\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575298 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575358 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575383 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-serving-cert\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575428 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-bound-sa-token\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575513 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c9a78b-be05-4c57-b51a-11da91ed2503-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: 
\"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575549 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575581 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxkh\" (UniqueName: \"kubernetes.io/projected/e6fd5584-6878-4be8-83bb-f61003df2639-kube-api-access-tlxkh\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575623 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-proxy-tls\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: E1001 12:40:09.575713 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.075700104 +0000 UTC m=+141.979175672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.575920 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9639b151-8302-489d-bfc4-dc8d2e371363-audit-dir\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576071 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-service-ca\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576126 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-srv-cert\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576227 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-encryption-config\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576338 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6b6\" (UniqueName: \"kubernetes.io/projected/33c9a78b-be05-4c57-b51a-11da91ed2503-kube-api-access-nt6b6\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576372 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cfbc937-568e-42d3-9d14-6dec309b3eed-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576393 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-profile-collector-cert\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576425 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-etcd-client\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576464 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-tls\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576523 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-service-ca\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576544 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a38ca0f1-cbc5-42ab-b174-a7affc65a898-config\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576691 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.576734 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd1e4468-40ba-4a47-8f89-99de7fec4071-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577206 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/619c15e7-db7d-4cbb-af79-5e52468bfc1a-machine-approver-tls\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577290 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhg5r\" (UniqueName: \"kubernetes.io/projected/95688848-6ae6-4abe-a83a-0b43899c2b81-kube-api-access-jhg5r\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/088f5c34-691a-4adb-95f8-46052ba7241a-serving-cert\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577397 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dff82d10-6428-408e-be1f-15df477faac8-trusted-ca\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577644 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42v22\" (UniqueName: \"kubernetes.io/projected/ade544d3-8b0e-43f3-b2c8-cbebd21f0405-kube-api-access-42v22\") pod \"dns-operator-744455d44c-p2n7t\" (UID: \"ade544d3-8b0e-43f3-b2c8-cbebd21f0405\") " pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577874 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzfg\" (UniqueName: \"kubernetes.io/projected/36e61ca6-b668-430c-9e68-00a308d192f0-kube-api-access-wvzfg\") pod \"migrator-59844c95c7-jgpzx\" (UID: \"36e61ca6-b668-430c-9e68-00a308d192f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577924 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbv5w\" (UniqueName: \"kubernetes.io/projected/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-kube-api-access-wbv5w\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577950 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c9a78b-be05-4c57-b51a-11da91ed2503-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.577980 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2dz5\" (UniqueName: \"kubernetes.io/projected/a60b32b5-e9f3-4dd1-be69-05d4ec0789fe-kube-api-access-p2dz5\") pod \"downloads-7954f5f757-ssczm\" (UID: \"a60b32b5-e9f3-4dd1-be69-05d4ec0789fe\") " pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.578005 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.578030 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-trusted-ca\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.578054 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ade544d3-8b0e-43f3-b2c8-cbebd21f0405-metrics-tls\") pod \"dns-operator-744455d44c-p2n7t\" (UID: \"ade544d3-8b0e-43f3-b2c8-cbebd21f0405\") " pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.578075 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.578119 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls658\" (UniqueName: \"kubernetes.io/projected/619c15e7-db7d-4cbb-af79-5e52468bfc1a-kube-api-access-ls658\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.578144 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-trusted-ca-bundle\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.663837 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt"] Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.673295 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vwnnv"] Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679135 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:09 crc kubenswrapper[4913]: E1001 12:40:09.679634 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.179606445 +0000 UTC m=+142.083082043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679761 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3cfbc937-568e-42d3-9d14-6dec309b3eed-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679844 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6l6\" (UniqueName: \"kubernetes.io/projected/9689e1f9-5b48-47da-af2f-dc1db858196d-kube-api-access-fl6l6\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkgdg\" (UID: \"9689e1f9-5b48-47da-af2f-dc1db858196d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679877 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-signing-cabundle\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679905 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48dc2da8-f187-4dde-8dd3-51f29b49c80a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679925 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-metrics-certs\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679945 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088bf733-26d8-4478-b6d8-346657f863ac-service-ca-bundle\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679967 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.679987 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/088f5c34-691a-4adb-95f8-46052ba7241a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680026 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95688848-6ae6-4abe-a83a-0b43899c2b81-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680066 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-node-bootstrap-token\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680090 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-certificates\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680112 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp965\" (UniqueName: \"kubernetes.io/projected/4d2bd20a-3d8d-4073-aca4-ceca547c186f-kube-api-access-gp965\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680140 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/619c15e7-db7d-4cbb-af79-5e52468bfc1a-auth-proxy-config\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680159 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-oauth-config\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680181 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-mountpoint-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 
12:40:09.680203 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-config\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680223 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-stkv5\" (UID: \"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680283 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-config\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680302 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-serving-cert\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680325 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgb6\" (UniqueName: \"kubernetes.io/projected/3abdc8ed-965f-4219-9a7c-f18b65448445-kube-api-access-qjgb6\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680372 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-oauth-serving-cert\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680393 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b2bff6-3b61-4d3b-8d88-9077b02ed990-config\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680412 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48dc2da8-f187-4dde-8dd3-51f29b49c80a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680432 4913 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61fd05a4-3c42-4fe4-8bce-486b538bab4c-metrics-tls\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680451 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv98j\" (UniqueName: \"kubernetes.io/projected/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-kube-api-access-pv98j\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680472 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krkxs\" (UniqueName: \"kubernetes.io/projected/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-kube-api-access-krkxs\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680492 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abdc8ed-965f-4219-9a7c-f18b65448445-serving-cert\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680515 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knw27\" (UniqueName: \"kubernetes.io/projected/1b35345f-132f-4890-a37d-e0dab7f975b7-kube-api-access-knw27\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680540 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-audit-policies\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680560 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-client\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680581 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frqc\" (UniqueName: \"kubernetes.io/projected/3cfbc937-568e-42d3-9d14-6dec309b3eed-kube-api-access-6frqc\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680602 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbmxl\" (UniqueName: 
\"kubernetes.io/projected/dbbcc58c-99fe-4c00-bc81-64d399027e66-kube-api-access-cbmxl\") pod \"multus-admission-controller-857f4d67dd-jrp62\" (UID: \"dbbcc58c-99fe-4c00-bc81-64d399027e66\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680623 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-default-certificate\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680647 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-serving-cert\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680673 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb64fb3e-2ed2-469a-8278-13be858098a1-serving-cert\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680701 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3542249d-1f2f-4814-8d0b-ba8b664f48d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680723 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-socket-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680750 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b35345f-132f-4890-a37d-e0dab7f975b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680783 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dff82d10-6428-408e-be1f-15df477faac8-metrics-tls\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680809 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92b2bff6-3b61-4d3b-8d88-9077b02ed990-images\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680832 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/92b2bff6-3b61-4d3b-8d88-9077b02ed990-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680855 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a38ca0f1-cbc5-42ab-b174-a7affc65a898-serving-cert\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680877 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6808721-c324-44b0-af75-7d438cc0d713-images\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680903 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txk79\" (UniqueName: \"kubernetes.io/projected/dff82d10-6428-408e-be1f-15df477faac8-kube-api-access-txk79\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680922 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wmr\" (UniqueName: \"kubernetes.io/projected/a38ca0f1-cbc5-42ab-b174-a7affc65a898-kube-api-access-26wmr\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680946 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680967 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/088f5c34-691a-4adb-95f8-46052ba7241a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.680950 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6fd5584-6878-4be8-83bb-f61003df2639-secret-volume\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc 
kubenswrapper[4913]: I1001 12:40:09.681057 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6808721-c324-44b0-af75-7d438cc0d713-proxy-tls\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681091 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681130 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd1e4468-40ba-4a47-8f89-99de7fec4071-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681156 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-signing-key\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681184 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-certs\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681209 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61fd05a4-3c42-4fe4-8bce-486b538bab4c-config-volume\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681300 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681695 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh24b\" (UniqueName: \"kubernetes.io/projected/9639b151-8302-489d-bfc4-dc8d2e371363-kube-api-access-mh24b\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681750 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/619c15e7-db7d-4cbb-af79-5e52468bfc1a-config\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681773 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-config\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681794 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95688848-6ae6-4abe-a83a-0b43899c2b81-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681821 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26rbc\" (UniqueName: \"kubernetes.io/projected/088f5c34-691a-4adb-95f8-46052ba7241a-kube-api-access-26rbc\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681846 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cfbc937-568e-42d3-9d14-6dec309b3eed-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681866 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-config\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681883 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681906 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1b35345f-132f-4890-a37d-e0dab7f975b7-tmpfs\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681936 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681956 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-serving-cert\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.681973 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-plugins-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.682689 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-oauth-serving-cert\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.682912 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b2bff6-3b61-4d3b-8d88-9077b02ed990-config\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.683412 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619c15e7-db7d-4cbb-af79-5e52468bfc1a-config\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.683511 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-config\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.683987 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-config\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.684165 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95688848-6ae6-4abe-a83a-0b43899c2b81-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: E1001 12:40:09.684278 4913 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.184251062 +0000 UTC m=+142.087726640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.685357 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb64fb3e-2ed2-469a-8278-13be858098a1-serving-cert\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.685426 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-certificates\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.685478 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cfbc937-568e-42d3-9d14-6dec309b3eed-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.685462 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-serving-cert\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.685925 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686216 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6fd5584-6878-4be8-83bb-f61003df2639-secret-volume\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686398 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-audit-policies\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686533 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-bound-sa-token\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686568 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c9a78b-be05-4c57-b51a-11da91ed2503-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686600 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686647 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3542249d-1f2f-4814-8d0b-ba8b664f48d7-srv-cert\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686727 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd1e4468-40ba-4a47-8f89-99de7fec4071-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686865 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-serving-cert\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.686967 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-config\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.687120 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/92b2bff6-3b61-4d3b-8d88-9077b02ed990-images\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.685387 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3cfbc937-568e-42d3-9d14-6dec309b3eed-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.687241 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/619c15e7-db7d-4cbb-af79-5e52468bfc1a-auth-proxy-config\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.687360 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxkh\" (UniqueName: \"kubernetes.io/projected/e6fd5584-6878-4be8-83bb-f61003df2639-kube-api-access-tlxkh\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.687580 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-proxy-tls\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.687646 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688018 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688033 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9639b151-8302-489d-bfc4-dc8d2e371363-audit-dir\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688076 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688101 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/9639b151-8302-489d-bfc4-dc8d2e371363-audit-dir\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688189 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-service-ca\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688229 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d48f7ca-f286-49f9-8541-bc186e440dfa-cert\") pod \"ingress-canary-tjv45\" (UID: \"4d48f7ca-f286-49f9-8541-bc186e440dfa\") " pod="openshift-ingress-canary/ingress-canary-tjv45" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688280 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2hf\" (UniqueName: \"kubernetes.io/projected/4d48f7ca-f286-49f9-8541-bc186e440dfa-kube-api-access-zb2hf\") pod \"ingress-canary-tjv45\" (UID: \"4d48f7ca-f286-49f9-8541-bc186e440dfa\") " pod="openshift-ingress-canary/ingress-canary-tjv45" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688475 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c9a78b-be05-4c57-b51a-11da91ed2503-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688514 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a38ca0f1-cbc5-42ab-b174-a7affc65a898-serving-cert\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688552 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dbbcc58c-99fe-4c00-bc81-64d399027e66-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrp62\" (UID: \"dbbcc58c-99fe-4c00-bc81-64d399027e66\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688589 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-srv-cert\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688632 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-stats-auth\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 
01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688667 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-encryption-config\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688893 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6b6\" (UniqueName: \"kubernetes.io/projected/33c9a78b-be05-4c57-b51a-11da91ed2503-kube-api-access-nt6b6\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.688937 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s75h\" (UniqueName: \"kubernetes.io/projected/61fd05a4-3c42-4fe4-8bce-486b538bab4c-kube-api-access-7s75h\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689018 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9zt\" (UniqueName: \"kubernetes.io/projected/c6808721-c324-44b0-af75-7d438cc0d713-kube-api-access-8x9zt\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689054 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjbc\" (UniqueName: \"kubernetes.io/projected/088bf733-26d8-4478-b6d8-346657f863ac-kube-api-access-9sjbc\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689076 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwks\" (UniqueName: \"kubernetes.io/projected/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-kube-api-access-8nwks\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689108 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-profile-collector-cert\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689128 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cfbc937-568e-42d3-9d14-6dec309b3eed-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 
12:40:09.689150 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48dc2da8-f187-4dde-8dd3-51f29b49c80a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689216 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-etcd-client\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689253 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-service-ca\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689647 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dff82d10-6428-408e-be1f-15df477faac8-metrics-tls\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689800 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-tls\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689840 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95688848-6ae6-4abe-a83a-0b43899c2b81-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689849 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4j2j\" (UniqueName: \"kubernetes.io/projected/7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad-kube-api-access-d4j2j\") pod \"package-server-manager-789f6589d5-stkv5\" (UID: \"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689951 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8kv\" (UniqueName: \"kubernetes.io/projected/3542249d-1f2f-4814-8d0b-ba8b664f48d7-kube-api-access-gj8kv\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.689988 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8frqk\" (UniqueName: \"kubernetes.io/projected/42df2d5b-cf53-4367-8f93-a231a07cd44e-kube-api-access-8frqk\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690018 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbsb\" (UniqueName: \"kubernetes.io/projected/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-kube-api-access-2xbsb\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690077 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-service-ca\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690107 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a38ca0f1-cbc5-42ab-b174-a7affc65a898-config\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690179 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690211 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b35345f-132f-4890-a37d-e0dab7f975b7-webhook-cert\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690238 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690289 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd1e4468-40ba-4a47-8f89-99de7fec4071-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690321 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/619c15e7-db7d-4cbb-af79-5e52468bfc1a-machine-approver-tls\") pod 
\"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690596 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-client\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690679 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-serving-cert\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690691 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/088f5c34-691a-4adb-95f8-46052ba7241a-serving-cert\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690700 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd1e4468-40ba-4a47-8f89-99de7fec4071-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690732 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dff82d10-6428-408e-be1f-15df477faac8-trusted-ca\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690764 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhg5r\" (UniqueName: \"kubernetes.io/projected/95688848-6ae6-4abe-a83a-0b43899c2b81-kube-api-access-jhg5r\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.690799 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c6808721-c324-44b0-af75-7d438cc0d713-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.691017 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-service-ca\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 
12:40:09.691230 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a38ca0f1-cbc5-42ab-b174-a7affc65a898-config\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.691399 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42v22\" (UniqueName: \"kubernetes.io/projected/ade544d3-8b0e-43f3-b2c8-cbebd21f0405-kube-api-access-42v22\") pod \"dns-operator-744455d44c-p2n7t\" (UID: \"ade544d3-8b0e-43f3-b2c8-cbebd21f0405\") " pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.691430 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzfg\" (UniqueName: \"kubernetes.io/projected/36e61ca6-b668-430c-9e68-00a308d192f0-kube-api-access-wvzfg\") pod \"migrator-59844c95c7-jgpzx\" (UID: \"36e61ca6-b668-430c-9e68-00a308d192f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.691965 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-encryption-config\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692208 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-srv-cert\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692477 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbv5w\" (UniqueName: \"kubernetes.io/projected/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-kube-api-access-wbv5w\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692542 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c9a78b-be05-4c57-b51a-11da91ed2503-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692583 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2dz5\" (UniqueName: \"kubernetes.io/projected/a60b32b5-e9f3-4dd1-be69-05d4ec0789fe-kube-api-access-p2dz5\") pod \"downloads-7954f5f757-ssczm\" (UID: \"a60b32b5-e9f3-4dd1-be69-05d4ec0789fe\") " pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692669 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692706 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-csi-data-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692742 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abdc8ed-965f-4219-9a7c-f18b65448445-config\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692778 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghvl\" (UniqueName: \"kubernetes.io/projected/ceea773f-549c-4d23-841c-a8e2ccb62f28-kube-api-access-pghvl\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692860 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-trusted-ca\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692904 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ade544d3-8b0e-43f3-b2c8-cbebd21f0405-metrics-tls\") pod \"dns-operator-744455d44c-p2n7t\" (UID: \"ade544d3-8b0e-43f3-b2c8-cbebd21f0405\") " pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.692939 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693036 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls658\" (UniqueName: \"kubernetes.io/projected/619c15e7-db7d-4cbb-af79-5e52468bfc1a-kube-api-access-ls658\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693081 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9689e1f9-5b48-47da-af2f-dc1db858196d-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-bkgdg\" (UID: \"9689e1f9-5b48-47da-af2f-dc1db858196d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693109 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-trusted-ca-bundle\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693130 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dff82d10-6428-408e-be1f-15df477faac8-trusted-ca\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693137 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-config\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693211 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-oauth-config\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693383 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9639b151-8302-489d-bfc4-dc8d2e371363-etcd-client\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.693744 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb64fb3e-2ed2-469a-8278-13be858098a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.694041 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-config\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.694417 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6fd5584-6878-4be8-83bb-f61003df2639-config-volume\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.694453 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/3abdc8ed-965f-4219-9a7c-f18b65448445-trusted-ca\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.694597 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-trusted-ca\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.694923 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9639b151-8302-489d-bfc4-dc8d2e371363-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.694986 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmvq\" (UniqueName: \"kubernetes.io/projected/eb64fb3e-2ed2-469a-8278-13be858098a1-kube-api-access-lhmvq\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695051 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8c97\" (UniqueName: \"kubernetes.io/projected/92b2bff6-3b61-4d3b-8d88-9077b02ed990-kube-api-access-g8c97\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695089 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-ca\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695116 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dff82d10-6428-408e-be1f-15df477faac8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695204 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-registration-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695240 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddzm\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-kube-api-access-xddzm\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695510 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6fd5584-6878-4be8-83bb-f61003df2639-config-volume\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.695810 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/92b2bff6-3b61-4d3b-8d88-9077b02ed990-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.696054 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-proxy-tls\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.696431 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-etcd-ca\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.697022 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c9a78b-be05-4c57-b51a-11da91ed2503-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.697160 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-tls\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.697254 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.698000 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/619c15e7-db7d-4cbb-af79-5e52468bfc1a-machine-approver-tls\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.698470 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-profile-collector-cert\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.698627 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ade544d3-8b0e-43f3-b2c8-cbebd21f0405-metrics-tls\") pod \"dns-operator-744455d44c-p2n7t\" (UID: \"ade544d3-8b0e-43f3-b2c8-cbebd21f0405\") " pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.698974 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/088f5c34-691a-4adb-95f8-46052ba7241a-serving-cert\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.699026 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-trusted-ca-bundle\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.717808 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frqc\" (UniqueName: \"kubernetes.io/projected/3cfbc937-568e-42d3-9d14-6dec309b3eed-kube-api-access-6frqc\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: W1001 12:40:09.728679 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb7d7281_96c5_4a7d_b57f_769fbafba858.slice/crio-824ea4dcc829ba1d015144f5c2015d0b6ff63b3594fb6ff7a71ce56551204f17 WatchSource:0}: Error finding container 824ea4dcc829ba1d015144f5c2015d0b6ff63b3594fb6ff7a71ce56551204f17: Status 404 returned error can't find the container with id 824ea4dcc829ba1d015144f5c2015d0b6ff63b3594fb6ff7a71ce56551204f17 Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.740662 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wmr\" (UniqueName: \"kubernetes.io/projected/a38ca0f1-cbc5-42ab-b174-a7affc65a898-kube-api-access-26wmr\") pod \"service-ca-operator-777779d784-x4fts\" (UID: \"a38ca0f1-cbc5-42ab-b174-a7affc65a898\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.759163 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh24b\" (UniqueName: \"kubernetes.io/projected/9639b151-8302-489d-bfc4-dc8d2e371363-kube-api-access-mh24b\") pod \"apiserver-7bbb656c7d-75z6d\" (UID: \"9639b151-8302-489d-bfc4-dc8d2e371363\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.785121 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv98j\" (UniqueName: 
\"kubernetes.io/projected/edb0a98c-923f-42cf-9f95-fbb4eabc8c73-kube-api-access-pv98j\") pod \"machine-config-controller-84d6567774-sg85m\" (UID: \"edb0a98c-923f-42cf-9f95-fbb4eabc8c73\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.794009 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.796751 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797012 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6l6\" (UniqueName: \"kubernetes.io/projected/9689e1f9-5b48-47da-af2f-dc1db858196d-kube-api-access-fl6l6\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkgdg\" (UID: \"9689e1f9-5b48-47da-af2f-dc1db858196d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797053 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-signing-cabundle\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797082 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48dc2da8-f187-4dde-8dd3-51f29b49c80a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797109 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-metrics-certs\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797138 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088bf733-26d8-4478-b6d8-346657f863ac-service-ca-bundle\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797184 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-node-bootstrap-token\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797221 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-mountpoint-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797249 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-config\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797301 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-stkv5\" (UID: \"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797329 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgb6\" (UniqueName: \"kubernetes.io/projected/3abdc8ed-965f-4219-9a7c-f18b65448445-kube-api-access-qjgb6\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797366 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48dc2da8-f187-4dde-8dd3-51f29b49c80a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797397 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61fd05a4-3c42-4fe4-8bce-486b538bab4c-metrics-tls\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797433 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abdc8ed-965f-4219-9a7c-f18b65448445-serving-cert\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797460 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knw27\" (UniqueName: \"kubernetes.io/projected/1b35345f-132f-4890-a37d-e0dab7f975b7-kube-api-access-knw27\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797502 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbmxl\" (UniqueName: \"kubernetes.io/projected/dbbcc58c-99fe-4c00-bc81-64d399027e66-kube-api-access-cbmxl\") pod 
\"multus-admission-controller-857f4d67dd-jrp62\" (UID: \"dbbcc58c-99fe-4c00-bc81-64d399027e66\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797525 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-default-certificate\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797553 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3542249d-1f2f-4814-8d0b-ba8b664f48d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797573 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-socket-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797593 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b35345f-132f-4890-a37d-e0dab7f975b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797619 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6808721-c324-44b0-af75-7d438cc0d713-images\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797665 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6808721-c324-44b0-af75-7d438cc0d713-proxy-tls\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797691 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797714 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-signing-key\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797735 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-certs\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797755 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61fd05a4-3c42-4fe4-8bce-486b538bab4c-config-volume\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797788 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1b35345f-132f-4890-a37d-e0dab7f975b7-tmpfs\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797817 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-plugins-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797849 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3542249d-1f2f-4814-8d0b-ba8b664f48d7-srv-cert\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797880 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797904 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797929 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d48f7ca-f286-49f9-8541-bc186e440dfa-cert\") pod \"ingress-canary-tjv45\" (UID: \"4d48f7ca-f286-49f9-8541-bc186e440dfa\") " pod="openshift-ingress-canary/ingress-canary-tjv45" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797952 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2hf\" (UniqueName: \"kubernetes.io/projected/4d48f7ca-f286-49f9-8541-bc186e440dfa-kube-api-access-zb2hf\") pod \"ingress-canary-tjv45\" (UID: \"4d48f7ca-f286-49f9-8541-bc186e440dfa\") " pod="openshift-ingress-canary/ingress-canary-tjv45" 
Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.797987 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dbbcc58c-99fe-4c00-bc81-64d399027e66-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrp62\" (UID: \"dbbcc58c-99fe-4c00-bc81-64d399027e66\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798012 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-stats-auth\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798053 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s75h\" (UniqueName: \"kubernetes.io/projected/61fd05a4-3c42-4fe4-8bce-486b538bab4c-kube-api-access-7s75h\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798079 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9zt\" (UniqueName: \"kubernetes.io/projected/c6808721-c324-44b0-af75-7d438cc0d713-kube-api-access-8x9zt\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798106 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjbc\" (UniqueName: \"kubernetes.io/projected/088bf733-26d8-4478-b6d8-346657f863ac-kube-api-access-9sjbc\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798123 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwks\" (UniqueName: \"kubernetes.io/projected/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-kube-api-access-8nwks\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798147 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48dc2da8-f187-4dde-8dd3-51f29b49c80a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798166 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4j2j\" (UniqueName: \"kubernetes.io/projected/7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad-kube-api-access-d4j2j\") pod \"package-server-manager-789f6589d5-stkv5\" (UID: \"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798185 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj8kv\" 
(UniqueName: \"kubernetes.io/projected/3542249d-1f2f-4814-8d0b-ba8b664f48d7-kube-api-access-gj8kv\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798202 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8frqk\" (UniqueName: \"kubernetes.io/projected/42df2d5b-cf53-4367-8f93-a231a07cd44e-kube-api-access-8frqk\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798218 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbsb\" (UniqueName: \"kubernetes.io/projected/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-kube-api-access-2xbsb\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798237 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.798252 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b35345f-132f-4890-a37d-e0dab7f975b7-webhook-cert\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: E1001 12:40:09.799064 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.299037078 +0000 UTC m=+142.202512666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.799572 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-mountpoint-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.800201 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48dc2da8-f187-4dde-8dd3-51f29b49c80a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.800461 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.800941 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/088bf733-26d8-4478-b6d8-346657f863ac-service-ca-bundle\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.801015 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-config\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.801191 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1b35345f-132f-4890-a37d-e0dab7f975b7-tmpfs\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.801351 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-plugins-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.802657 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c6808721-c324-44b0-af75-7d438cc0d713-images\") pod 
\"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.802751 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61fd05a4-3c42-4fe4-8bce-486b538bab4c-config-volume\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.803046 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-socket-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.803635 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61fd05a4-3c42-4fe4-8bce-486b538bab4c-metrics-tls\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804402 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krkxs\" (UniqueName: \"kubernetes.io/projected/ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea-kube-api-access-krkxs\") pod \"etcd-operator-b45778765-4fjzp\" (UID: \"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804547 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c6808721-c324-44b0-af75-7d438cc0d713-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804562 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-signing-cabundle\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804695 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-csi-data-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804717 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abdc8ed-965f-4219-9a7c-f18b65448445-config\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804736 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghvl\" (UniqueName: 
\"kubernetes.io/projected/ceea773f-549c-4d23-841c-a8e2ccb62f28-kube-api-access-pghvl\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.804814 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9689e1f9-5b48-47da-af2f-dc1db858196d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkgdg\" (UID: \"9689e1f9-5b48-47da-af2f-dc1db858196d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.805050 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-metrics-certs\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.805073 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3abdc8ed-965f-4219-9a7c-f18b65448445-trusted-ca\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.805311 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-registration-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.805928 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-csi-data-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.806330 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3abdc8ed-965f-4219-9a7c-f18b65448445-trusted-ca\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.806553 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c6808721-c324-44b0-af75-7d438cc0d713-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.807494 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42df2d5b-cf53-4367-8f93-a231a07cd44e-registration-dir\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2" Oct 01 12:40:09 
crc kubenswrapper[4913]: I1001 12:40:09.807671 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-default-certificate\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.808050 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abdc8ed-965f-4219-9a7c-f18b65448445-config\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.808096 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-certs\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.808210 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48dc2da8-f187-4dde-8dd3-51f29b49c80a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.808747 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b35345f-132f-4890-a37d-e0dab7f975b7-webhook-cert\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.809168 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-signing-key\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.809401 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-stkv5\" (UID: \"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.809996 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b35345f-132f-4890-a37d-e0dab7f975b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.810440 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6808721-c324-44b0-af75-7d438cc0d713-proxy-tls\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: 
\"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.810546 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.810617 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abdc8ed-965f-4219-9a7c-f18b65448445-serving-cert\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.810716 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-node-bootstrap-token\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.810853 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d48f7ca-f286-49f9-8541-bc186e440dfa-cert\") pod \"ingress-canary-tjv45\" (UID: \"4d48f7ca-f286-49f9-8541-bc186e440dfa\") " pod="openshift-ingress-canary/ingress-canary-tjv45" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.811157 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3542249d-1f2f-4814-8d0b-ba8b664f48d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.812446 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dbbcc58c-99fe-4c00-bc81-64d399027e66-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrp62\" (UID: \"dbbcc58c-99fe-4c00-bc81-64d399027e66\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.812658 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9689e1f9-5b48-47da-af2f-dc1db858196d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkgdg\" (UID: \"9689e1f9-5b48-47da-af2f-dc1db858196d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.812685 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/088bf733-26d8-4478-b6d8-346657f863ac-stats-auth\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.813564 4913 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.815715 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3542249d-1f2f-4814-8d0b-ba8b664f48d7-srv-cert\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.822788 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5gpp5\" (UID: \"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.837697 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26rbc\" (UniqueName: \"kubernetes.io/projected/088f5c34-691a-4adb-95f8-46052ba7241a-kube-api-access-26rbc\") pod \"openshift-config-operator-7777fb866f-nxmpm\" (UID: \"088f5c34-691a-4adb-95f8-46052ba7241a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.848142 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.865655 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txk79\" (UniqueName: \"kubernetes.io/projected/dff82d10-6428-408e-be1f-15df477faac8-kube-api-access-txk79\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.881246 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp965\" (UniqueName: \"kubernetes.io/projected/4d2bd20a-3d8d-4073-aca4-ceca547c186f-kube-api-access-gp965\") pod \"console-f9d7485db-97mb9\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.899399 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-bound-sa-token\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.906889 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:09 crc kubenswrapper[4913]: E1001 12:40:09.907230 4913 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.407214475 +0000 UTC m=+142.310690053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.927243 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxkh\" (UniqueName: \"kubernetes.io/projected/e6fd5584-6878-4be8-83bb-f61003df2639-kube-api-access-tlxkh\") pod \"collect-profiles-29322030-jlrxc\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.941830 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6b6\" (UniqueName: \"kubernetes.io/projected/33c9a78b-be05-4c57-b51a-11da91ed2503-kube-api-access-nt6b6\") pod \"openshift-controller-manager-operator-756b6f6bc6-zcf4q\" (UID: \"33c9a78b-be05-4c57-b51a-11da91ed2503\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.946678 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.971972 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4fts"] Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.980869 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cfbc937-568e-42d3-9d14-6dec309b3eed-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x5dnm\" (UID: \"3cfbc937-568e-42d3-9d14-6dec309b3eed\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:09 crc kubenswrapper[4913]: W1001 12:40:09.982243 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda38ca0f1_cbc5_42ab_b174_a7affc65a898.slice/crio-f711049a2168c671c3c19c54860e7cee597f76d1471642da58bf4ac831c4b131 WatchSource:0}: Error finding container f711049a2168c671c3c19c54860e7cee597f76d1471642da58bf4ac831c4b131: Status 404 returned error can't find the container with id f711049a2168c671c3c19c54860e7cee597f76d1471642da58bf4ac831c4b131 Oct 01 12:40:09 crc kubenswrapper[4913]: I1001 12:40:09.999485 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhg5r\" (UniqueName: \"kubernetes.io/projected/95688848-6ae6-4abe-a83a-0b43899c2b81-kube-api-access-jhg5r\") pod \"kube-storage-version-migrator-operator-b67b599dd-mvbbx\" (UID: \"95688848-6ae6-4abe-a83a-0b43899c2b81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:10 crc 
kubenswrapper[4913]: I1001 12:40:10.002643 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.007963 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.008118 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.508085464 +0000 UTC m=+142.411561052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.008310 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.008647 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.508636208 +0000 UTC m=+142.412111786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.018023 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.018715 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42v22\" (UniqueName: \"kubernetes.io/projected/ade544d3-8b0e-43f3-b2c8-cbebd21f0405-kube-api-access-42v22\") pod \"dns-operator-744455d44c-p2n7t\" (UID: \"ade544d3-8b0e-43f3-b2c8-cbebd21f0405\") " pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.046343 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzfg\" (UniqueName: \"kubernetes.io/projected/36e61ca6-b668-430c-9e68-00a308d192f0-kube-api-access-wvzfg\") pod \"migrator-59844c95c7-jgpzx\" (UID: \"36e61ca6-b668-430c-9e68-00a308d192f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.059860 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.060350 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.061870 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.063614 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbv5w\" (UniqueName: \"kubernetes.io/projected/9cc74c00-b3a6-422c-8ad4-1547e0834e3f-kube-api-access-wbv5w\") pod \"catalog-operator-68c6474976-5twnx\" (UID: \"9cc74c00-b3a6-422c-8ad4-1547e0834e3f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.068835 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d"] Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.070276 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.074986 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.082102 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.085661 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls658\" (UniqueName: \"kubernetes.io/projected/619c15e7-db7d-4cbb-af79-5e52468bfc1a-kube-api-access-ls658\") pod \"machine-approver-56656f9798-m4r4x\" (UID: \"619c15e7-db7d-4cbb-af79-5e52468bfc1a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.087474 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.100557 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddzm\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-kube-api-access-xddzm\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.106804 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.110453 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.110638 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.610589486 +0000 UTC m=+142.514065064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.111080 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.111474 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.61146369 +0000 UTC m=+142.514939478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.123166 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dff82d10-6428-408e-be1f-15df477faac8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5fqgh\" (UID: \"dff82d10-6428-408e-be1f-15df477faac8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.141387 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmvq\" (UniqueName: \"kubernetes.io/projected/eb64fb3e-2ed2-469a-8278-13be858098a1-kube-api-access-lhmvq\") pod \"authentication-operator-69f744f599-rswcz\" (UID: \"eb64fb3e-2ed2-469a-8278-13be858098a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.166629 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8c97\" (UniqueName: \"kubernetes.io/projected/92b2bff6-3b61-4d3b-8d88-9077b02ed990-kube-api-access-g8c97\") pod \"machine-api-operator-5694c8668f-fm7mq\" (UID: \"92b2bff6-3b61-4d3b-8d88-9077b02ed990\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.182407 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.183801 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2dz5\" (UniqueName: \"kubernetes.io/projected/a60b32b5-e9f3-4dd1-be69-05d4ec0789fe-kube-api-access-p2dz5\") pod \"downloads-7954f5f757-ssczm\" (UID: \"a60b32b5-e9f3-4dd1-be69-05d4ec0789fe\") " pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.205838 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgb6\" (UniqueName: \"kubernetes.io/projected/3abdc8ed-965f-4219-9a7c-f18b65448445-kube-api-access-qjgb6\") pod \"console-operator-58897d9998-7f575\" (UID: \"3abdc8ed-965f-4219-9a7c-f18b65448445\") " pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.212161 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.212573 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:40:10.712558875 +0000 UTC m=+142.616034453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.222756 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.225403 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48dc2da8-f187-4dde-8dd3-51f29b49c80a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vf784\" (UID: \"48dc2da8-f187-4dde-8dd3-51f29b49c80a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.239550 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6l6\" (UniqueName: \"kubernetes.io/projected/9689e1f9-5b48-47da-af2f-dc1db858196d-kube-api-access-fl6l6\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkgdg\" (UID: \"9689e1f9-5b48-47da-af2f-dc1db858196d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.270051 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjbc\" (UniqueName: \"kubernetes.io/projected/088bf733-26d8-4478-b6d8-346657f863ac-kube-api-access-9sjbc\") pod \"router-default-5444994796-jk8wn\" (UID: \"088bf733-26d8-4478-b6d8-346657f863ac\") " pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.277085 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2hf\" (UniqueName: \"kubernetes.io/projected/4d48f7ca-f286-49f9-8541-bc186e440dfa-kube-api-access-zb2hf\") pod \"ingress-canary-tjv45\" (UID: \"4d48f7ca-f286-49f9-8541-bc186e440dfa\") " pod="openshift-ingress-canary/ingress-canary-tjv45" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.311923 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.313628 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.314097 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.81407694 +0000 UTC m=+142.717552578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.321121 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s75h\" (UniqueName: \"kubernetes.io/projected/61fd05a4-3c42-4fe4-8bce-486b538bab4c-kube-api-access-7s75h\") pod \"dns-default-vqbl5\" (UID: \"61fd05a4-3c42-4fe4-8bce-486b538bab4c\") " pod="openshift-dns/dns-default-vqbl5"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.327377 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.327858 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9zt\" (UniqueName: \"kubernetes.io/projected/c6808721-c324-44b0-af75-7d438cc0d713-kube-api-access-8x9zt\") pod \"machine-config-operator-74547568cd-qxwbr\" (UID: \"c6808721-c324-44b0-af75-7d438cc0d713\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.335007 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.356794 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8frqk\" (UniqueName: \"kubernetes.io/projected/42df2d5b-cf53-4367-8f93-a231a07cd44e-kube-api-access-8frqk\") pod \"csi-hostpathplugin-78qn2\" (UID: \"42df2d5b-cf53-4367-8f93-a231a07cd44e\") " pod="hostpath-provisioner/csi-hostpathplugin-78qn2"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.398068 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwks\" (UniqueName: \"kubernetes.io/projected/8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3-kube-api-access-8nwks\") pod \"service-ca-9c57cc56f-kssl9\" (UID: \"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3\") " pod="openshift-service-ca/service-ca-9c57cc56f-kssl9"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.398068 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knw27\" (UniqueName: \"kubernetes.io/projected/1b35345f-132f-4890-a37d-e0dab7f975b7-kube-api-access-knw27\") pod \"packageserver-d55dfcdfc-r5r2z\" (UID: \"1b35345f-132f-4890-a37d-e0dab7f975b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.399827 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.408390 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4j2j\" (UniqueName: \"kubernetes.io/projected/7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad-kube-api-access-d4j2j\") pod \"package-server-manager-789f6589d5-stkv5\" (UID: \"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.422457 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.423010 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.922983327 +0000 UTC m=+142.826458905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.425499 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.425865 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:10.925846915 +0000 UTC m=+142.829322503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.427796 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.433386 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbsb\" (UniqueName: \"kubernetes.io/projected/b05c3501-40da-4059-b7c7-5e3a7bd66ab1-kube-api-access-2xbsb\") pod \"machine-config-server-f5nlk\" (UID: \"b05c3501-40da-4059-b7c7-5e3a7bd66ab1\") " pod="openshift-machine-config-operator/machine-config-server-f5nlk"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.433661 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7f575"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.439102 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kssl9"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.446171 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.458423 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jk8wn"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.464423 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8kv\" (UniqueName: \"kubernetes.io/projected/3542249d-1f2f-4814-8d0b-ba8b664f48d7-kube-api-access-gj8kv\") pod \"olm-operator-6b444d44fb-g7d59\" (UID: \"3542249d-1f2f-4814-8d0b-ba8b664f48d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.465769 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.467760 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9664e6b0-4a34-483f-b219-9c7e7cc6d37b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z4xlz\" (UID: \"9664e6b0-4a34-483f-b219-9c7e7cc6d37b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.472027 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.474646 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-97mb9"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.479617 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbmxl\" (UniqueName: \"kubernetes.io/projected/dbbcc58c-99fe-4c00-bc81-64d399027e66-kube-api-access-cbmxl\") pod \"multus-admission-controller-857f4d67dd-jrp62\" (UID: \"dbbcc58c-99fe-4c00-bc81-64d399027e66\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.486536 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.493910 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tjv45"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.509018 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.513749 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-78qn2"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.514810 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghvl\" (UniqueName: \"kubernetes.io/projected/ceea773f-549c-4d23-841c-a8e2ccb62f28-kube-api-access-pghvl\") pod \"marketplace-operator-79b997595-vbrpv\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.518793 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f5nlk"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.523559 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vqbl5"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.527584 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.528194 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.028179432 +0000 UTC m=+142.931655010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.551177 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" event={"ID":"c349f466-f6f2-44a8-aea1-090f74dd7abe","Type":"ContainerStarted","Data":"104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.551215 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" event={"ID":"c349f466-f6f2-44a8-aea1-090f74dd7abe","Type":"ContainerStarted","Data":"d0cb582dd46430ff2da43d850435f9e5dbe11801279118e90c12c5f24c79f5c3"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.551567 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.553441 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" event={"ID":"619c15e7-db7d-4cbb-af79-5e52468bfc1a","Type":"ContainerStarted","Data":"192f63e094f8ab20568844ad181bba44ad2bf5203bd482b47f8e9425f8b0ff3d"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.559384 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" event={"ID":"a38ca0f1-cbc5-42ab-b174-a7affc65a898","Type":"ContainerStarted","Data":"85c7d2ccb44a38d251ef1f0044c32443df048915d88827461e0eaef43841404d"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.559428 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" event={"ID":"a38ca0f1-cbc5-42ab-b174-a7affc65a898","Type":"ContainerStarted","Data":"f711049a2168c671c3c19c54860e7cee597f76d1471642da58bf4ac831c4b131"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.564084 4913 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qjc8c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body=
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.564136 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" podUID="c349f466-f6f2-44a8-aea1-090f74dd7abe" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.567872 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" event={"ID":"f5cde242-eb9a-4376-b458-5054299e53e0","Type":"ContainerStarted","Data":"253f0e508b33ff4fcec286237c990da4b7bfa534723cfe585db1e62b43f6927a"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.567904 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" event={"ID":"f5cde242-eb9a-4376-b458-5054299e53e0","Type":"ContainerStarted","Data":"f6eb06027e10da770f65ee7fce52fc855de56a4d3327908a869a6cfc69aa3ae3"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.569608 4913 generic.go:334] "Generic (PLEG): container finished" podID="60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e" containerID="d7fc7e7846610fba509f07151486bfd4ed6ef24eea59a148ee762be863eed91d" exitCode=0
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.569715 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" event={"ID":"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e","Type":"ContainerDied","Data":"d7fc7e7846610fba509f07151486bfd4ed6ef24eea59a148ee762be863eed91d"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.571453 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" event={"ID":"9639b151-8302-489d-bfc4-dc8d2e371363","Type":"ContainerStarted","Data":"a5b62668e0951cd2ea06999f3960332f151cc00f2fa1fdd052dd1308a038822f"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.573486 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" event={"ID":"bb7d7281-96c5-4a7d-b57f-769fbafba858","Type":"ContainerStarted","Data":"17d8224fd5e1c214f91d489f4b4d53e8755766f76af76a8b69c070345de98785"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.573548 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" event={"ID":"bb7d7281-96c5-4a7d-b57f-769fbafba858","Type":"ContainerStarted","Data":"824ea4dcc829ba1d015144f5c2015d0b6ff63b3594fb6ff7a71ce56551204f17"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.573673 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.577117 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" event={"ID":"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3","Type":"ContainerStarted","Data":"a34e62a39f81afcf22b3634187e4318fd53ff4c38de699ece874228ad19dce4e"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.577149 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" event={"ID":"e7bc76e0-8633-45d1-9eb6-d2d256a5e4f3","Type":"ContainerStarted","Data":"4d929a48f867f109bf894ca6745946ec349d5882f4fb01563a4e8cd9e678cc0b"}
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.581135 4913 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vwnnv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.581402 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" podUID="bb7d7281-96c5-4a7d-b57f-769fbafba858" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.596980 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.597537 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.631021 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.631378 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.131357964 +0000 UTC m=+143.034833542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: W1001 12:40:10.675924 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088f5c34_691a_4adb_95f8_46052ba7241a.slice/crio-130a131f623f00c13a605b384b7095b8fac1139214966aa97b869b9bfd3eaf3b WatchSource:0}: Error finding container 130a131f623f00c13a605b384b7095b8fac1139214966aa97b869b9bfd3eaf3b: Status 404 returned error can't find the container with id 130a131f623f00c13a605b384b7095b8fac1139214966aa97b869b9bfd3eaf3b
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.695932 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.713883 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.720354 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz"
Oct 01 12:40:10 crc kubenswrapper[4913]: W1001 12:40:10.727902 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cfbc937_568e_42d3_9d14_6dec309b3eed.slice/crio-b29bfe6a1192a3886a9d9cc7a8b323994ca00a92f130f2a8b38669a4a6e6c117 WatchSource:0}: Error finding container b29bfe6a1192a3886a9d9cc7a8b323994ca00a92f130f2a8b38669a4a6e6c117: Status 404 returned error can't find the container with id b29bfe6a1192a3886a9d9cc7a8b323994ca00a92f130f2a8b38669a4a6e6c117
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.732384 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.732539 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.232500539 +0000 UTC m=+143.135976117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.732702 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.736150 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.236139039 +0000 UTC m=+143.139614617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.754914 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.778132 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62"
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.797660 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.802252 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.834189 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.839314 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.33529497 +0000 UTC m=+143.238770548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.839565 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.839894 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.339887205 +0000 UTC m=+143.243362783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.845349 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m"]
Oct 01 12:40:10 crc kubenswrapper[4913]: I1001 12:40:10.940544 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:10 crc kubenswrapper[4913]: E1001 12:40:10.941409 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.441393491 +0000 UTC m=+143.344869069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.011210 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4fjzp"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.023918 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p2n7t"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.042334 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.042834 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.542817384 +0000 UTC m=+143.446292962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.143583 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.143865 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.643838937 +0000 UTC m=+143.547314515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.144007 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.144307 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.644295769 +0000 UTC m=+143.547771347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: W1001 12:40:11.166600 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2d81fe_9ab9_4630_b7ba_7ffe7ee6daea.slice/crio-7bdd6f601de12a3ec68422cadb8ea7bbe552c87dc03f1ba1646c50ffc9c1093f WatchSource:0}: Error finding container 7bdd6f601de12a3ec68422cadb8ea7bbe552c87dc03f1ba1646c50ffc9c1093f: Status 404 returned error can't find the container with id 7bdd6f601de12a3ec68422cadb8ea7bbe552c87dc03f1ba1646c50ffc9c1093f
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.245441 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.245815 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.745797725 +0000 UTC m=+143.649273313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.350563 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.350855 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.850842467 +0000 UTC m=+143.754318045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.414967 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hxvqt" podStartSLOduration=122.414951337 podStartE2EDuration="2m2.414951337s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:11.382385663 +0000 UTC m=+143.285861271" watchObservedRunningTime="2025-10-01 12:40:11.414951337 +0000 UTC m=+143.318426915"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.451256 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.451589 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.951551431 +0000 UTC m=+143.855027039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.451814 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.452119 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:11.952105966 +0000 UTC m=+143.855581544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.553899 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.554321 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.05430274 +0000 UTC m=+143.957778338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.586790 4913 generic.go:334] "Generic (PLEG): container finished" podID="9639b151-8302-489d-bfc4-dc8d2e371363" containerID="5eaf27a6d2d52e38e8b2ab1319e370f7c43e2d5c914e20edc14b121fcb2e4d13" exitCode=0
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.587113 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" event={"ID":"9639b151-8302-489d-bfc4-dc8d2e371363","Type":"ContainerDied","Data":"5eaf27a6d2d52e38e8b2ab1319e370f7c43e2d5c914e20edc14b121fcb2e4d13"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.589553 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jk8wn" event={"ID":"088bf733-26d8-4478-b6d8-346657f863ac","Type":"ContainerStarted","Data":"458662407c68403210d1590b6765ac66a729e0ea6843db0b03110903e9abf2e6"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.602102 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f5nlk" event={"ID":"b05c3501-40da-4059-b7c7-5e3a7bd66ab1","Type":"ContainerStarted","Data":"24f607f0266ad843378952c9574b59a0e4e17b19929b2f27b713f4c389d3ad23"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.606109 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" event={"ID":"ade544d3-8b0e-43f3-b2c8-cbebd21f0405","Type":"ContainerStarted","Data":"9703b745b280c6152f6f7b35ffff007a310899ef4c31e7e45b977dbfa0f20f9c"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.632148 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97mb9" event={"ID":"4d2bd20a-3d8d-4073-aca4-ceca547c186f","Type":"ContainerStarted","Data":"64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.632207 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97mb9" event={"ID":"4d2bd20a-3d8d-4073-aca4-ceca547c186f","Type":"ContainerStarted","Data":"8005f4076373b0a70b9c49a30a9196586722949640608da74483ba5127dd970f"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.668163 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.668511 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.16849975 +0000 UTC m=+144.071975328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.750234 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" event={"ID":"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2","Type":"ContainerStarted","Data":"86d7783c56c9f18f6dfe802eaf45371792a25e7730afb2382e6380ca623aa4fc"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.751448 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.777556 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.778608 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.27859333 +0000 UTC m=+144.182068908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.779924 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.788381 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.799827 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rswcz"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.821240 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" event={"ID":"619c15e7-db7d-4cbb-af79-5e52468bfc1a","Type":"ContainerStarted","Data":"22e1618c3efa834faa63e3fec4b98c37a3e17fbe733433f3fda5fc0f006bf427"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.859932 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fm7mq"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.861566 4913 generic.go:334] "Generic (PLEG): container finished" podID="088f5c34-691a-4adb-95f8-46052ba7241a" containerID="72e6c90ee07be32acf236178470f333d2194b7ea02cb9a1d59a0978799bf0975" exitCode=0
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.861803 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" event={"ID":"088f5c34-691a-4adb-95f8-46052ba7241a","Type":"ContainerDied","Data":"72e6c90ee07be32acf236178470f333d2194b7ea02cb9a1d59a0978799bf0975"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.861831 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" event={"ID":"088f5c34-691a-4adb-95f8-46052ba7241a","Type":"ContainerStarted","Data":"130a131f623f00c13a605b384b7095b8fac1139214966aa97b869b9bfd3eaf3b"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.880507 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" event={"ID":"36e61ca6-b668-430c-9e68-00a308d192f0","Type":"ContainerStarted","Data":"35d7ce42d6ee8024690f5976bef47381d75729cadf0d93896d84f4fc5ecef85d"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.885773 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4fts" podStartSLOduration=122.885756058 podStartE2EDuration="2m2.885756058s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:11.87475077 +0000 UTC m=+143.778226348" watchObservedRunningTime="2025-10-01 12:40:11.885756058 +0000 UTC m=+143.789231636"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.887339 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ssczm"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.887999 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.888388 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.38837586 +0000 UTC m=+144.291851438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.906651 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" event={"ID":"3cfbc937-568e-42d3-9d14-6dec309b3eed","Type":"ContainerStarted","Data":"b29bfe6a1192a3886a9d9cc7a8b323994ca00a92f130f2a8b38669a4a6e6c117"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.911797 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" event={"ID":"95688848-6ae6-4abe-a83a-0b43899c2b81","Type":"ContainerStarted","Data":"23f1473b1e91890ebaffed78e5a484975404df908cc314274f50cbf894f9d66f"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.912926 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" event={"ID":"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea","Type":"ContainerStarted","Data":"7bdd6f601de12a3ec68422cadb8ea7bbe552c87dc03f1ba1646c50ffc9c1093f"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.917071 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" event={"ID":"edb0a98c-923f-42cf-9f95-fbb4eabc8c73","Type":"ContainerStarted","Data":"fe6a457ffc1ab2b0306f7bf980c6bad15cffe24b348ca7af1bc6c8a777e029d2"}
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.929522 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.935634 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5vwx6" podStartSLOduration=122.935615742 podStartE2EDuration="2m2.935615742s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:11.92189233 +0000 UTC m=+143.825367918" watchObservedRunningTime="2025-10-01 12:40:11.935615742 +0000 UTC m=+143.839091320"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.937416 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7f575"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.948608 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.956202 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" podStartSLOduration=122.95618589 podStartE2EDuration="2m2.95618589s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:11.955657766 +0000 UTC m=+143.859133364" watchObservedRunningTime="2025-10-01 12:40:11.95618589 +0000 UTC m=+143.859661458"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.967986 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh"]
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.978853 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" podStartSLOduration=122.978839625 podStartE2EDuration="2m2.978839625s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:11.976617795 +0000 UTC m=+143.880093383" watchObservedRunningTime="2025-10-01 12:40:11.978839625 +0000 UTC m=+143.882315203"
Oct 01 12:40:11 crc kubenswrapper[4913]: I1001 12:40:11.993692 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:11 crc kubenswrapper[4913]: W1001 12:40:11.994829 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abdc8ed_965f_4219_9a7c_f18b65448445.slice/crio-1c93c4df5cce252a5936ebaac1c5004a60a2c6a2cd6e8383fd24cab2d77d9d89 WatchSource:0}: Error finding container 1c93c4df5cce252a5936ebaac1c5004a60a2c6a2cd6e8383fd24cab2d77d9d89: Status 404 returned error can't find the container with id 1c93c4df5cce252a5936ebaac1c5004a60a2c6a2cd6e8383fd24cab2d77d9d89
Oct 01 12:40:11 crc kubenswrapper[4913]: E1001 12:40:11.994948 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.494927372 +0000 UTC m=+144.398402950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.097180 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.099319 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.599306576 +0000 UTC m=+144.502782154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.104160 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" podStartSLOduration=123.104141427 podStartE2EDuration="2m3.104141427s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:12.103998383 +0000 UTC m=+144.007473961" watchObservedRunningTime="2025-10-01 12:40:12.104141427 +0000 UTC m=+144.007617005"
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.138208 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.141244 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.141813 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.146963 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.169348 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.204511 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.205224 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.705203721 +0000 UTC m=+144.608679299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.205386 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.205906 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.705874229 +0000 UTC m=+144.609349807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.230482 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tjv45"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.231841 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbrpv"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.242114 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrp62"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.306438 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.306577 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.806553432 +0000 UTC m=+144.710029010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.306856 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.307173 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.807160668 +0000 UTC m=+144.710636246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.313428 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.321569 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-78qn2"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.323545 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kssl9"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.326799 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vqbl5"]
Oct 01 12:40:12 crc kubenswrapper[4913]: W1001 12:40:12.387974 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d48f7ca_f286_49f9_8541_bc186e440dfa.slice/crio-7f784546c5053f622b735c508b61a484eb124a5a811d020d314044e9456be5fe WatchSource:0}: Error finding container 7f784546c5053f622b735c508b61a484eb124a5a811d020d314044e9456be5fe: Status 404 returned error can't find the container with id 7f784546c5053f622b735c508b61a484eb124a5a811d020d314044e9456be5fe
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.407791 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.408100 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:12.908081148 +0000 UTC m=+144.811556726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.427869 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59"]
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.511844 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g"
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.514054 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.014040295 +0000 UTC m=+144.917515873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.612805 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.613139 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-97mb9" podStartSLOduration=123.613119435 podStartE2EDuration="2m3.613119435s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:12.611775138 +0000 UTC m=+144.515250726" watchObservedRunningTime="2025-10-01 12:40:12.613119435 +0000 UTC m=+144.516595013"
Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.613257 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2025-10-01 12:40:13.113243129 +0000 UTC m=+145.016718697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.714371 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.714735 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.214719013 +0000 UTC m=+145.118194591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.821056 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.822156 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.322135269 +0000 UTC m=+145.225610837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:12 crc kubenswrapper[4913]: I1001 12:40:12.926161 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:12 crc kubenswrapper[4913]: E1001 12:40:12.926443 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.42643287 +0000 UTC m=+145.329908448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.002673 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" event={"ID":"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e","Type":"ContainerStarted","Data":"d125a1c5a68fd3188afd0a6ec40b023a10dd29af919113091b517bef3660e1db"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.002722 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" event={"ID":"60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e","Type":"ContainerStarted","Data":"c42773e1a1e229f16cbb04dc955999efb29a09ff96b67652dfb9afe45de892cc"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.019323 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" event={"ID":"95688848-6ae6-4abe-a83a-0b43899c2b81","Type":"ContainerStarted","Data":"9312915c71cfec9f7c765b528b886d0053e4964376b5c801ece36521f013567d"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.027537 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.028467 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.528452661 +0000 UTC m=+145.431928239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.042402 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tjv45" event={"ID":"4d48f7ca-f286-49f9-8541-bc186e440dfa","Type":"ContainerStarted","Data":"7f784546c5053f622b735c508b61a484eb124a5a811d020d314044e9456be5fe"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.077582 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" event={"ID":"9689e1f9-5b48-47da-af2f-dc1db858196d","Type":"ContainerStarted","Data":"6e23f239aaa78668f68d759b757358112c7132c198686de9968e7c36176a447a"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.090925 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" podStartSLOduration=124.090911586 podStartE2EDuration="2m4.090911586s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.061993961 +0000 UTC m=+144.965469549" watchObservedRunningTime="2025-10-01 12:40:13.090911586 +0000 UTC m=+144.994387164" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.091374 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mvbbx" podStartSLOduration=124.091366898 podStartE2EDuration="2m4.091366898s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.090598307 +0000 UTC m=+144.994073885" watchObservedRunningTime="2025-10-01 12:40:13.091366898 +0000 UTC m=+144.994842476" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.114555 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" event={"ID":"9639b151-8302-489d-bfc4-dc8d2e371363","Type":"ContainerStarted","Data":"20a020140cdcc7dcb6b8721f1c7e353f1c9796aa9a171a8fc57db35d243b7bf8"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.118219 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jk8wn" event={"ID":"088bf733-26d8-4478-b6d8-346657f863ac","Type":"ContainerStarted","Data":"b3c240f477fff9ed13abe191b2d621890961c9be6b7c6a2bd4fed8d8db7f90a0"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.128799 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.129719 4913 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.629704059 +0000 UTC m=+145.533179697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.130734 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" event={"ID":"ab2d81fe-9ab9-4630-b7ba-7ffe7ee6daea","Type":"ContainerStarted","Data":"34564af4f2dfb702092edbde42fb3a4fe5732fbbef99b2e51b99f886b64bc363"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.142781 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" event={"ID":"9664e6b0-4a34-483f-b219-9c7e7cc6d37b","Type":"ContainerStarted","Data":"544dd714a8aac93222d285c53c43243ec9366415c2e6000e116acf403ecdb782"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.173242 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" event={"ID":"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3","Type":"ContainerStarted","Data":"916be3289df965b4afefa11b47721b06a3109c32fc79556d573c7af9e4962309"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.193468 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f5nlk" event={"ID":"b05c3501-40da-4059-b7c7-5e3a7bd66ab1","Type":"ContainerStarted","Data":"df1e7610347acf284b142e80250996b74349ba1f0d9129fd7db94828d948a698"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.208370 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" event={"ID":"48dc2da8-f187-4dde-8dd3-51f29b49c80a","Type":"ContainerStarted","Data":"97cc77f13e000caafb2b9dc1832674ddfe8771d0ea304eb5a4f320da4caf356b"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.217218 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4fjzp" podStartSLOduration=124.217199644 podStartE2EDuration="2m4.217199644s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.2159211 +0000 UTC m=+145.119396688" watchObservedRunningTime="2025-10-01 12:40:13.217199644 +0000 UTC m=+145.120675212" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.220041 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" podStartSLOduration=124.220033481 podStartE2EDuration="2m4.220033481s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.181584638 +0000 UTC m=+145.085060216" watchObservedRunningTime="2025-10-01 
12:40:13.220033481 +0000 UTC m=+145.123509059" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.231957 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.232060 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.732041048 +0000 UTC m=+145.635516626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.232506 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.235594 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.735583464 +0000 UTC m=+145.639059042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.255415 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jk8wn" podStartSLOduration=124.255401332 podStartE2EDuration="2m4.255401332s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.253639444 +0000 UTC m=+145.157115022" watchObservedRunningTime="2025-10-01 12:40:13.255401332 +0000 UTC m=+145.158876910" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.279940 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" event={"ID":"e6fd5584-6878-4be8-83bb-f61003df2639","Type":"ContainerStarted","Data":"1137a411f44765a5a75fe26e2e6736875a8cc4f16724a24f4b6b5191f717df47"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.280239 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" event={"ID":"e6fd5584-6878-4be8-83bb-f61003df2639","Type":"ContainerStarted","Data":"acbc30e033e995bbb1deaf0de0d3d99f57a80ea62aed466d6e698903c3267bfa"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.288066 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" event={"ID":"36e61ca6-b668-430c-9e68-00a308d192f0","Type":"ContainerStarted","Data":"a01189110eec61f506fdb31107ddcd4849c6c0f33c720d0cfc0edeb3d4fe2dda"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.288109 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" event={"ID":"36e61ca6-b668-430c-9e68-00a308d192f0","Type":"ContainerStarted","Data":"05c9ed0eb9e93d35c666f8bbe453e0f3794a8298a1fa7eb553ce8bd6ed6fbfb3"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.290056 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" event={"ID":"3542249d-1f2f-4814-8d0b-ba8b664f48d7","Type":"ContainerStarted","Data":"16b7ac38c8803f276e2722dddee9445ba6345094752ab4244968258d3e49446d"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.301790 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" podStartSLOduration=124.3017731 podStartE2EDuration="2m4.3017731s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.272501416 +0000 UTC m=+145.175977004" watchObservedRunningTime="2025-10-01 12:40:13.3017731 +0000 UTC m=+145.205248678" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.305154 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-f5nlk" 
podStartSLOduration=6.305123732 podStartE2EDuration="6.305123732s" podCreationTimestamp="2025-10-01 12:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.304904176 +0000 UTC m=+145.208379764" watchObservedRunningTime="2025-10-01 12:40:13.305123732 +0000 UTC m=+145.208599310" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.321039 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" event={"ID":"33c9a78b-be05-4c57-b51a-11da91ed2503","Type":"ContainerStarted","Data":"2e24c8ca081163073e6385899a5309a8dc73b04b806f1012ba5c39072b5a47b9"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.321100 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" event={"ID":"33c9a78b-be05-4c57-b51a-11da91ed2503","Type":"ContainerStarted","Data":"4407b22b01ef3efb44785b351e97ab2c38ad0581140332caf0e2dc4589fcb81b"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.328709 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-78qn2" event={"ID":"42df2d5b-cf53-4367-8f93-a231a07cd44e","Type":"ContainerStarted","Data":"83485669e351beb7c1b9f975a8d85a921b279b55d5df5dc0f0e31e17c6dd75fb"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.334133 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.335047 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.835032293 +0000 UTC m=+145.738507871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.355859 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" podStartSLOduration=124.355840739 podStartE2EDuration="2m4.355840739s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.354915963 +0000 UTC m=+145.258391561" watchObservedRunningTime="2025-10-01 12:40:13.355840739 +0000 UTC m=+145.259316317" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.360737 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" event={"ID":"ee64ed7c-bf28-4c04-a5e3-b9c241ed8ad2","Type":"ContainerStarted","Data":"10562639ede8c4ea38a68a80213b8accab2e320f1d73aeb5e49649ebca8f9893"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.384714 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" event={"ID":"ade544d3-8b0e-43f3-b2c8-cbebd21f0405","Type":"ContainerStarted","Data":"0c3a17d7c1cf6ccbe65de3f9784a5643046a28004d6491d4096269a86a20eb6a"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.407890 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ssczm" event={"ID":"a60b32b5-e9f3-4dd1-be69-05d4ec0789fe","Type":"ContainerStarted","Data":"3fadbfafcbb7900975b412cc8cafd7237ec21c79c6610b49d3f957cc6e6fb0a3"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.407938 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ssczm" event={"ID":"a60b32b5-e9f3-4dd1-be69-05d4ec0789fe","Type":"ContainerStarted","Data":"5bd72b576528f4b94122336c3b2189a3b635be16130e172c367d943af6a90fee"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.409159 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.424467 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5gpp5" podStartSLOduration=124.424448441 podStartE2EDuration="2m4.424448441s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.423384292 +0000 UTC m=+145.326859890" watchObservedRunningTime="2025-10-01 12:40:13.424448441 +0000 UTC m=+145.327924019" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.425878 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgpzx" podStartSLOduration=124.42587186 podStartE2EDuration="2m4.42587186s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.395888246 +0000 UTC m=+145.299363834" watchObservedRunningTime="2025-10-01 12:40:13.42587186 +0000 UTC m=+145.329347438" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.435364 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.437635 4913 patch_prober.go:28] interesting pod/downloads-7954f5f757-ssczm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.437674 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ssczm" podUID="a60b32b5-e9f3-4dd1-be69-05d4ec0789fe" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.439251 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:13.939240693 +0000 UTC m=+145.842716271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.460606 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.468069 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" event={"ID":"edb0a98c-923f-42cf-9f95-fbb4eabc8c73","Type":"ContainerStarted","Data":"f4980750dc94f6af3412c78ea347affc49c6d065bc1c56feca8c291c73320f15"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.468116 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" event={"ID":"edb0a98c-923f-42cf-9f95-fbb4eabc8c73","Type":"ContainerStarted","Data":"de6e68a94236beca570ced1712d79d7ba25a7b1894b49f5c286761d021939dc0"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.473784 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:13 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:13 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:13 
crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.473838 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.480404 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.502504 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zcf4q" podStartSLOduration=124.502481269 podStartE2EDuration="2m4.502481269s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.494942504 +0000 UTC m=+145.398418082" watchObservedRunningTime="2025-10-01 12:40:13.502481269 +0000 UTC m=+145.405956837" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.503190 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" podStartSLOduration=124.503186328 podStartE2EDuration="2m4.503186328s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.453177951 +0000 UTC m=+145.356653539" watchObservedRunningTime="2025-10-01 12:40:13.503186328 +0000 UTC m=+145.406661906" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.507949 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7f575" event={"ID":"3abdc8ed-965f-4219-9a7c-f18b65448445","Type":"ContainerStarted","Data":"1c93c4df5cce252a5936ebaac1c5004a60a2c6a2cd6e8383fd24cab2d77d9d89"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.508638 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.521570 4913 patch_prober.go:28] interesting pod/console-operator-58897d9998-7f575 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.521623 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7f575" podUID="3abdc8ed-965f-4219-9a7c-f18b65448445" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.530170 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" event={"ID":"ceea773f-549c-4d23-841c-a8e2ccb62f28","Type":"ContainerStarted","Data":"5e266b525fb8c5a8e305bf7d615da96d3df3314f50864d77660241297e14c910"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.538767 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.538898 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.038850057 +0000 UTC m=+145.942325635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.538945 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.541533 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.041511929 +0000 UTC m=+145.944987507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.567687 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" podStartSLOduration=124.567671519 podStartE2EDuration="2m4.567671519s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.561286066 +0000 UTC m=+145.464761664" watchObservedRunningTime="2025-10-01 12:40:13.567671519 +0000 UTC m=+145.471147087" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.617543 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" event={"ID":"9cc74c00-b3a6-422c-8ad4-1547e0834e3f","Type":"ContainerStarted","Data":"ee8ffb6f63a5d8b994d96b10f8caefb3537d269bd02ace565898196d86a1f3e6"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.617618 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" event={"ID":"9cc74c00-b3a6-422c-8ad4-1547e0834e3f","Type":"ContainerStarted","Data":"e6064e265142dcf86f65d687e72de2c5a4ffa5473d683125a72a88e264c3607f"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.618472 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.619075 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sg85m" podStartSLOduration=124.619059915 podStartE2EDuration="2m4.619059915s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.618123029 +0000 UTC m=+145.521598637" watchObservedRunningTime="2025-10-01 12:40:13.619059915 +0000 UTC m=+145.522535493" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.645950 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.646249 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.146225322 +0000 UTC m=+146.049700900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.661571 4913 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5twnx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.661626 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" podUID="9cc74c00-b3a6-422c-8ad4-1547e0834e3f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.674181 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" event={"ID":"dff82d10-6428-408e-be1f-15df477faac8","Type":"ContainerStarted","Data":"a3ecd572bc722452e670f30c77c435957cd2e953f8abda8926dd27e2f4b093b1"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.674224 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" event={"ID":"dff82d10-6428-408e-be1f-15df477faac8","Type":"ContainerStarted","Data":"cabb3821e000476b9d4cf30ebe16c6606f3c32db636ed503c571f7635827e09f"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.704532 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" event={"ID":"eb64fb3e-2ed2-469a-8278-13be858098a1","Type":"ContainerStarted","Data":"7b22cf1ed53891d76f0451111866f24186019b99eed95fd27c3923c64f885db1"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.704659 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" event={"ID":"eb64fb3e-2ed2-469a-8278-13be858098a1","Type":"ContainerStarted","Data":"e60c89ac8a8c022d430f6ce98cf08c6f7d480c156d86d074853012c49c1a797e"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.715449 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ssczm" podStartSLOduration=124.715433761 podStartE2EDuration="2m4.715433761s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.690169035 +0000 UTC m=+145.593644613" watchObservedRunningTime="2025-10-01 12:40:13.715433761 +0000 UTC m=+145.618909339" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.746893 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: 
\"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.747893 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.247880852 +0000 UTC m=+146.151356430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.748703 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" event={"ID":"c6808721-c324-44b0-af75-7d438cc0d713","Type":"ContainerStarted","Data":"533941f04f56c414afc1b4102c7c2ce8488ca8b52a83e1beb357a6b02c764ace"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.748752 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" event={"ID":"c6808721-c324-44b0-af75-7d438cc0d713","Type":"ContainerStarted","Data":"c0688f0cdd14202c091d15ef630a291d12a1912588523fe71da09aa112356069"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.765673 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" event={"ID":"1b35345f-132f-4890-a37d-e0dab7f975b7","Type":"ContainerStarted","Data":"79b30fe1b4a13cde39491d08526c68c1004775788a0f14e77c1de0c8de0dd0be"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.766226 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.767083 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" podStartSLOduration=124.767068033 podStartE2EDuration="2m4.767068033s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.758998844 +0000 UTC m=+145.662474442" watchObservedRunningTime="2025-10-01 12:40:13.767068033 +0000 UTC m=+145.670543611" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.767238 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7f575" podStartSLOduration=124.767232717 podStartE2EDuration="2m4.767232717s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.721435503 +0000 UTC m=+145.624911091" watchObservedRunningTime="2025-10-01 12:40:13.767232717 +0000 UTC m=+145.670708295" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.775282 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" event={"ID":"3cfbc937-568e-42d3-9d14-6dec309b3eed","Type":"ContainerStarted","Data":"897746406b65ab898e0edfd56468f61c1de5a5298ef727123d57a005047fbfc1"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.780795 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" event={"ID":"dbbcc58c-99fe-4c00-bc81-64d399027e66","Type":"ContainerStarted","Data":"98cca8f1e2a72b3bfaa48729bd7943277d98e9391fe23c043e0a7bcb3fe9e604"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.784322 4913 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r5r2z container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.784494 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" podUID="1b35345f-132f-4890-a37d-e0dab7f975b7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.787931 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rswcz" podStartSLOduration=124.787917039 podStartE2EDuration="2m4.787917039s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.786469999 +0000 UTC m=+145.689945597" watchObservedRunningTime="2025-10-01 12:40:13.787917039 +0000 UTC m=+145.691392617" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.831158 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" event={"ID":"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad","Type":"ContainerStarted","Data":"74d153e8bbe41a284e76d514c32878c8a96d088e12a45d8c7c52bba18ed17f4c"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.851860 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" event={"ID":"619c15e7-db7d-4cbb-af79-5e52468bfc1a","Type":"ContainerStarted","Data":"fce910b6fe6a1b1de91005b6522c4aa8831903209fdf76aab4e28a48f159c0ad"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.853428 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.854169 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.354152447 +0000 UTC m=+146.257628085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.854396 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x5dnm" podStartSLOduration=124.854379963 podStartE2EDuration="2m4.854379963s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.821453149 +0000 UTC m=+145.724928747" watchObservedRunningTime="2025-10-01 12:40:13.854379963 +0000 UTC m=+145.757855541" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.891137 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" podStartSLOduration=124.891118241 podStartE2EDuration="2m4.891118241s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.853575801 +0000 UTC m=+145.757051379" watchObservedRunningTime="2025-10-01 12:40:13.891118241 +0000 UTC m=+145.794593819" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.891288 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" event={"ID":"92b2bff6-3b61-4d3b-8d88-9077b02ed990","Type":"ContainerStarted","Data":"ba3c9c42ad30ccc703ac718a7b53ba35284b8767f8a4decf49ffeb7f046978cf"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.891565 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" event={"ID":"92b2bff6-3b61-4d3b-8d88-9077b02ed990","Type":"ContainerStarted","Data":"aa8ad35a9e4ea46c3789ffb91ce5094cd308bebff06483d4daee901dacd4b11d"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.907903 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vqbl5" event={"ID":"61fd05a4-3c42-4fe4-8bce-486b538bab4c","Type":"ContainerStarted","Data":"8722f5a87376546fca75fa71f01e807cf4246b45d0a019d6e6c8f74a4a25e189"} Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.933347 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4r4x" podStartSLOduration=124.933331416 podStartE2EDuration="2m4.933331416s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.891542212 +0000 UTC m=+145.795017820" watchObservedRunningTime="2025-10-01 12:40:13.933331416 +0000 UTC m=+145.836806994" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.933429 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" podStartSLOduration=124.933423989 podStartE2EDuration="2m4.933423989s" podCreationTimestamp="2025-10-01 12:38:09 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:13.931757013 +0000 UTC m=+145.835232621" watchObservedRunningTime="2025-10-01 12:40:13.933423989 +0000 UTC m=+145.836899567" Oct 01 12:40:13 crc kubenswrapper[4913]: I1001 12:40:13.957136 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:13 crc kubenswrapper[4913]: E1001 12:40:13.959574 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.459563108 +0000 UTC m=+146.363038686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.060022 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.060401 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.560387255 +0000 UTC m=+146.463862833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.161084 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.161484 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 12:40:14.661467119 +0000 UTC m=+146.564942697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.191531 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.191590 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.200376 4913 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wp7l9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.200451 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" podUID="60e0be04-076b-4b5b-8c2c-6fe5a7e2c49e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.261581 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.262061 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.76204725 +0000 UTC m=+146.665522828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.362889 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.363179 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.863167075 +0000 UTC m=+146.766642653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.462243 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:14 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:14 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:14 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.462352 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.463754 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.463929 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:14.96391025 +0000 UTC m=+146.867385828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.565308 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.565739 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.065728135 +0000 UTC m=+146.969203713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.666113 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.666239 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.166218153 +0000 UTC m=+147.069693731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.666326 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.666602 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.166593083 +0000 UTC m=+147.070068651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.766675 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.766806 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.266789862 +0000 UTC m=+147.170265440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.766883 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.767163 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.267153263 +0000 UTC m=+147.170628841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.849043 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.849096 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.867440 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.867989 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.36797453 +0000 UTC m=+147.271450108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.873664 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.926526 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" event={"ID":"3542249d-1f2f-4814-8d0b-ba8b664f48d7","Type":"ContainerStarted","Data":"301d272799e8679733286351694a44f2c6ee00cb274b98241fa6fc7eb683dd60"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.926761 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.928689 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" event={"ID":"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad","Type":"ContainerStarted","Data":"29b73817fdc0879ccd6f5a04c1fb87cadb2c152ef758ab171cdf9fac77910fba"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.928721 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" event={"ID":"7e6e26d4-dc47-4b8d-9af9-c2c8b10728ad","Type":"ContainerStarted","Data":"16cb173612637b8bca70a51f4479ec01b2ee3e492855edf7ded55d5a2f3cbf46"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.929058 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.929123 4913 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-g7d59 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.929149 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" podUID="3542249d-1f2f-4814-8d0b-ba8b664f48d7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.932406 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-78qn2" event={"ID":"42df2d5b-cf53-4367-8f93-a231a07cd44e","Type":"ContainerStarted","Data":"b4dbd002c9a4b393839c19ca728f483e40922a577d9c8a412da688aee38993d0"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.935374 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" event={"ID":"dbbcc58c-99fe-4c00-bc81-64d399027e66","Type":"ContainerStarted","Data":"31230ebcc5dfda131bb0672a0b67f55c359f2858217d54330b9f20a7c56e555f"} Oct 01 12:40:14 crc 
kubenswrapper[4913]: I1001 12:40:14.935416 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" event={"ID":"dbbcc58c-99fe-4c00-bc81-64d399027e66","Type":"ContainerStarted","Data":"695c4f3fcbd37bd7c6e536bf2e29fbffd1125e46d4319e25015a1235435059bc"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.937648 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" event={"ID":"9664e6b0-4a34-483f-b219-9c7e7cc6d37b","Type":"ContainerStarted","Data":"cf7d97a5958f0023a94e78d3248cba406c1f0fb5bb402480e5340adb83b2124d"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.939449 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p2n7t" event={"ID":"ade544d3-8b0e-43f3-b2c8-cbebd21f0405","Type":"ContainerStarted","Data":"5459e22771be93c1029fb57734c9bb5cfab4b4d17d48f4c022b6c0f9377eb700"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.941772 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" event={"ID":"dff82d10-6428-408e-be1f-15df477faac8","Type":"ContainerStarted","Data":"546786be6db4518c866d17f19da3b3d896b064c11d6441432823f0cec15461f4"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.943628 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" event={"ID":"9689e1f9-5b48-47da-af2f-dc1db858196d","Type":"ContainerStarted","Data":"bd21b49663de5a1f6cbca5692f2cba79b427a779d1dbff763fd7c7f028c8c409"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.944697 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" event={"ID":"48dc2da8-f187-4dde-8dd3-51f29b49c80a","Type":"ContainerStarted","Data":"6207127886bd6eb5039f0f8044c8e009d4d94a9959cff9d483192a1395013782"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.946258 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" event={"ID":"c6808721-c324-44b0-af75-7d438cc0d713","Type":"ContainerStarted","Data":"37be83a873d6a1e7241743e209d52555e76f70adf17d0bbdd8bf61a8e04a0dae"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.947597 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" event={"ID":"1b35345f-132f-4890-a37d-e0dab7f975b7","Type":"ContainerStarted","Data":"4fd6f00f28eb932d221cee2a4d39b35fda263c0a79addc3eefa7fd073ebfabe0"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.949776 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vqbl5" event={"ID":"61fd05a4-3c42-4fe4-8bce-486b538bab4c","Type":"ContainerStarted","Data":"c351b115739b370437474e002a7b4256a531697b52f2f27c4cdd42e0347469ce"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.949825 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vqbl5" event={"ID":"61fd05a4-3c42-4fe4-8bce-486b538bab4c","Type":"ContainerStarted","Data":"3a876424644f12f81addd8462fca15f4be11cc3a46bdb7b419ee7a9d2c69767f"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.949975 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vqbl5" Oct 01 
12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.950804 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tjv45" event={"ID":"4d48f7ca-f286-49f9-8541-bc186e440dfa","Type":"ContainerStarted","Data":"063ed40d463e066b7fc2ece6e0cc19c2c32c18c9e42ecc6605885b272c794cb0"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.952643 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" event={"ID":"ceea773f-549c-4d23-841c-a8e2ccb62f28","Type":"ContainerStarted","Data":"7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.953297 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.954349 4913 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vbrpv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.954382 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.954587 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" podStartSLOduration=125.954349354 podStartE2EDuration="2m5.954349354s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:14.950193672 +0000 UTC m=+146.853669260" watchObservedRunningTime="2025-10-01 12:40:14.954349354 +0000 UTC m=+146.857824942" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.955364 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fm7mq" event={"ID":"92b2bff6-3b61-4d3b-8d88-9077b02ed990","Type":"ContainerStarted","Data":"325068699af2a1f922cfb2e3059a6a8724e829b125134e260a62a5018e42871e"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.960168 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" event={"ID":"088f5c34-691a-4adb-95f8-46052ba7241a","Type":"ContainerStarted","Data":"c7f794c3847b5a24993ec23ff231d7adc5eca713152bae3cf54b38c8f4710669"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.961455 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kssl9" event={"ID":"8c0846a0-bb2e-4f5f-bbcd-b9a4e8e81af3","Type":"ContainerStarted","Data":"fb48a1a296c4b8c1a399d2b185b68de6f597d33bac67012a53f120868b794a93"} Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.969354 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7f575" event={"ID":"3abdc8ed-965f-4219-9a7c-f18b65448445","Type":"ContainerStarted","Data":"f8934e19eba5af1721043351be26a08d4b6e54b711552be21da181ae210b5ac6"} Oct 01 
12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.970069 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.972303 4913 patch_prober.go:28] interesting pod/downloads-7954f5f757-ssczm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.972432 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ssczm" podUID="a60b32b5-e9f3-4dd1-be69-05d4ec0789fe" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.972486 4913 patch_prober.go:28] interesting pod/console-operator-58897d9998-7f575 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.972602 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7f575" podUID="3abdc8ed-965f-4219-9a7c-f18b65448445" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 01 12:40:14 crc kubenswrapper[4913]: E1001 12:40:14.974173 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.474161092 +0000 UTC m=+147.377636670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.979992 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-75z6d" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.986356 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5twnx" Oct 01 12:40:14 crc kubenswrapper[4913]: I1001 12:40:14.995049 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrp62" podStartSLOduration=125.995033539 podStartE2EDuration="2m5.995033539s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:14.993301502 +0000 UTC m=+146.896777110" watchObservedRunningTime="2025-10-01 12:40:14.995033539 +0000 UTC m=+146.898509117" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.030998 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z4xlz" podStartSLOduration=126.030983215 podStartE2EDuration="2m6.030983215s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.028139968 +0000 UTC m=+146.931615556" watchObservedRunningTime="2025-10-01 12:40:15.030983215 +0000 UTC m=+146.934458793" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.053217 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vf784" podStartSLOduration=126.053202328 podStartE2EDuration="2m6.053202328s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.05290239 +0000 UTC m=+146.956377988" watchObservedRunningTime="2025-10-01 12:40:15.053202328 +0000 UTC m=+146.956677916" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.074284 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.075202 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.575172005 +0000 UTC m=+147.478647583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.075866 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.079573 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.579559224 +0000 UTC m=+147.483034802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.102960 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5fqgh" podStartSLOduration=126.102945438 podStartE2EDuration="2m6.102945438s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.100752629 +0000 UTC m=+147.004228237" watchObservedRunningTime="2025-10-01 12:40:15.102945438 +0000 UTC m=+147.006421006" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.136018 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" podStartSLOduration=126.135999176 podStartE2EDuration="2m6.135999176s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.135805701 +0000 UTC m=+147.039281299" watchObservedRunningTime="2025-10-01 12:40:15.135999176 +0000 UTC m=+147.039474754" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.176657 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.177213 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.677196834 +0000 UTC m=+147.580672412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.189972 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxwbr" podStartSLOduration=126.18995542 podStartE2EDuration="2m6.18995542s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.187148134 +0000 UTC m=+147.090623722" watchObservedRunningTime="2025-10-01 12:40:15.18995542 +0000 UTC m=+147.093430998" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.190379 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkgdg" podStartSLOduration=126.190373212 podStartE2EDuration="2m6.190373212s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.15825563 +0000 UTC m=+147.061731228" watchObservedRunningTime="2025-10-01 12:40:15.190373212 +0000 UTC m=+147.093848800" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.241872 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vqbl5" podStartSLOduration=8.24184741 podStartE2EDuration="8.24184741s" podCreationTimestamp="2025-10-01 12:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.240999387 +0000 UTC m=+147.144474985" watchObservedRunningTime="2025-10-01 12:40:15.24184741 +0000 UTC m=+147.145322988" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.280943 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.281562 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.781550117 +0000 UTC m=+147.685025695 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.304790 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" podStartSLOduration=126.304773477 podStartE2EDuration="2m6.304773477s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.303523474 +0000 UTC m=+147.206999072" watchObservedRunningTime="2025-10-01 12:40:15.304773477 +0000 UTC m=+147.208249055" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.341572 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tjv45" podStartSLOduration=8.341557407 podStartE2EDuration="8.341557407s" podCreationTimestamp="2025-10-01 12:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:15.338069951 +0000 UTC m=+147.241545529" watchObservedRunningTime="2025-10-01 12:40:15.341557407 +0000 UTC m=+147.245032985" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.400252 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.400988 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:15.900968739 +0000 UTC m=+147.804444317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.463436 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:15 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:15 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:15 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.464346 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.502121 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.502467 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.002456224 +0000 UTC m=+147.905931802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.602781 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.602956 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.102931132 +0000 UTC m=+148.006406710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.603022 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.603347 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.103339433 +0000 UTC m=+148.006815011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.703876 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.704086 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.204058527 +0000 UTC m=+148.107534105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.704286 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.704676 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.204666364 +0000 UTC m=+148.108141982 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.805578 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.805708 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.305687896 +0000 UTC m=+148.209163474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.806123 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.806526 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.306503809 +0000 UTC m=+148.209979387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.907370 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:15 crc kubenswrapper[4913]: E1001 12:40:15.907651 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.407615154 +0000 UTC m=+148.311090772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.948439 4913 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r5r2z container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.948493 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" podUID="1b35345f-132f-4890-a37d-e0dab7f975b7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.973879 4913 patch_prober.go:28] interesting pod/downloads-7954f5f757-ssczm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.973931 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ssczm" podUID="a60b32b5-e9f3-4dd1-be69-05d4ec0789fe" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.974584 4913 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vbrpv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.974631 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Oct 01 12:40:15 crc kubenswrapper[4913]: I1001 12:40:15.979113 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g7d59" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.009998 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.010661 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.010837 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.011153 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.011317 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.012323 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.512312396 +0000 UTC m=+148.415787974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.011163 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.014938 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7f575" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.017170 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.027934 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.031563 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.031913 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.045398 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.073976 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nxmpm" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.084679 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrj6h"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.086381 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.091799 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.094732 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrj6h"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.117464 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.117768 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdq95\" (UniqueName: \"kubernetes.io/projected/89fe354d-c11c-4c4f-a2c8-309d9da44911-kube-api-access-zdq95\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.117844 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-catalog-content\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.117918 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-utilities\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.118109 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.618094568 +0000 UTC m=+148.521570146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.220234 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-utilities\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.220305 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdq95\" (UniqueName: \"kubernetes.io/projected/89fe354d-c11c-4c4f-a2c8-309d9da44911-kube-api-access-zdq95\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.220334 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.220362 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-catalog-content\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.220774 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-catalog-content\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.220983 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-utilities\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.221474 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.721464054 +0000 UTC m=+148.624939632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.229150 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.252489 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdq95\" (UniqueName: \"kubernetes.io/projected/89fe354d-c11c-4c4f-a2c8-309d9da44911-kube-api-access-zdq95\") pod \"certified-operators-qrj6h\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.273007 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4k6h5"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.274347 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.280628 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.287801 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4k6h5"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.340173 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.340865 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.840838014 +0000 UTC m=+148.744313592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.441923 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-utilities\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.441987 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.442029 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrkx\" (UniqueName: \"kubernetes.io/projected/9a5b3988-8324-4401-b951-3b1e3fea763a-kube-api-access-czrkx\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.442054 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-catalog-content\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.442382 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:16.942369442 +0000 UTC m=+148.845845020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.454480 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.463694 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:16 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:16 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:16 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.463731 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.469717 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-drblq"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.476818 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.498340 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drblq"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.543429 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.548861 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.048832662 +0000 UTC m=+148.952308240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.557780 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-utilities\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.557895 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.557959 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrkx\" (UniqueName: \"kubernetes.io/projected/9a5b3988-8324-4401-b951-3b1e3fea763a-kube-api-access-czrkx\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.557985 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-catalog-content\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.558412 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-catalog-content\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.558575 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-utilities\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.560424 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.060402706 +0000 UTC m=+148.963878284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.582809 4913 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.588347 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrkx\" (UniqueName: \"kubernetes.io/projected/9a5b3988-8324-4401-b951-3b1e3fea763a-kube-api-access-czrkx\") pod \"community-operators-4k6h5\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.658759 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.659213 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4cc\" (UniqueName: \"kubernetes.io/projected/d412bdd1-98a3-4053-a4b5-c43eff851d62-kube-api-access-7b4cc\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.659353 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-catalog-content\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.659411 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-utilities\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.659534 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.159519156 +0000 UTC m=+149.062994734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.664217 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.670249 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsvt4"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.675120 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.687335 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsvt4"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.748923 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r5r2z" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760440 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-utilities\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760491 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-utilities\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760516 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760537 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58q5p\" (UniqueName: \"kubernetes.io/projected/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-kube-api-access-58q5p\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760572 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4cc\" (UniqueName: \"kubernetes.io/projected/d412bdd1-98a3-4053-a4b5-c43eff851d62-kube-api-access-7b4cc\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760612 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-catalog-content\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.760636 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-catalog-content\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.761025 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-utilities\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.761302 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.261258028 +0000 UTC m=+149.164733596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.761780 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-catalog-content\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.784285 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4cc\" (UniqueName: \"kubernetes.io/projected/d412bdd1-98a3-4053-a4b5-c43eff851d62-kube-api-access-7b4cc\") pod \"certified-operators-drblq\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.814872 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.861673 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.861925 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.3618903 +0000 UTC m=+149.265365888 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.862002 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-catalog-content\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.862224 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-utilities\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.862297 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.862452 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58q5p\" (UniqueName: \"kubernetes.io/projected/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-kube-api-access-58q5p\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.863430 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.363403362 +0000 UTC m=+149.266878950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lhq4g" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.867224 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-utilities\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.869048 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-catalog-content\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.870639 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrj6h"] Oct 01 12:40:16 crc kubenswrapper[4913]: W1001 12:40:16.885151 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89fe354d_c11c_4c4f_a2c8_309d9da44911.slice/crio-bf0f1d6972bdf3fcc71fbe42d537d0074b2fa0a9fc2cff4b24ed19e2b39d9906 WatchSource:0}: Error finding container bf0f1d6972bdf3fcc71fbe42d537d0074b2fa0a9fc2cff4b24ed19e2b39d9906: Status 404 returned error can't find the container with id bf0f1d6972bdf3fcc71fbe42d537d0074b2fa0a9fc2cff4b24ed19e2b39d9906 Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.886001 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58q5p\" (UniqueName: \"kubernetes.io/projected/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-kube-api-access-58q5p\") pod \"community-operators-lsvt4\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.963126 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4k6h5"] Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.963861 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:16 crc kubenswrapper[4913]: E1001 12:40:16.964596 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:40:17.464563508 +0000 UTC m=+149.368039086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.982752 4913 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.982813 4913 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.979781 4913 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T12:40:16.582836685Z","Handler":null,"Name":""} Oct 01 12:40:16 crc kubenswrapper[4913]: W1001 12:40:16.986176 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a5b3988_8324_4401_b951_3b1e3fea763a.slice/crio-e8cb1c5e9305ccdead244dcf4e63665ca74f45b021dea1e405d277d6d0ca5a6d WatchSource:0}: Error finding container e8cb1c5e9305ccdead244dcf4e63665ca74f45b021dea1e405d277d6d0ca5a6d: Status 404 returned error can't find the container with id e8cb1c5e9305ccdead244dcf4e63665ca74f45b021dea1e405d277d6d0ca5a6d Oct 01 12:40:16 crc kubenswrapper[4913]: I1001 12:40:16.986486 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrj6h" event={"ID":"89fe354d-c11c-4c4f-a2c8-309d9da44911","Type":"ContainerStarted","Data":"bf0f1d6972bdf3fcc71fbe42d537d0074b2fa0a9fc2cff4b24ed19e2b39d9906"} Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.007084 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4441ad9a1d7317c96ecc2a7a54329da4bddc92d746c774fa6bc120e415049395"} Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.007128 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e4311eed8a5dbfd94f143584963b703562a7cce0fd5f10868a40adbf057ac171"} Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.022261 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.033965 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-78qn2" event={"ID":"42df2d5b-cf53-4367-8f93-a231a07cd44e","Type":"ContainerStarted","Data":"c18ffbd988939ecf620f389f00f250ada593575b79ef8c55fce1a19b7fec0822"} Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.033993 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-78qn2" event={"ID":"42df2d5b-cf53-4367-8f93-a231a07cd44e","Type":"ContainerStarted","Data":"1eb655202e1937ee5a0e72282e9bb211ea896a3a524b20d739ff6faba4aeb38c"} Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.034003 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-78qn2" event={"ID":"42df2d5b-cf53-4367-8f93-a231a07cd44e","Type":"ContainerStarted","Data":"6447f46494e15040554e1b1d4d996497284b4cf1c8c651a86fa1286fb7882b2e"} Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.038963 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.064968 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.070704 4913 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.070744 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.087822 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-78qn2" podStartSLOduration=10.087807234 podStartE2EDuration="10.087807234s" podCreationTimestamp="2025-10-01 12:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:17.08622436 +0000 UTC m=+148.989699958" watchObservedRunningTime="2025-10-01 12:40:17.087807234 +0000 UTC m=+148.991282812" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.118561 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lhq4g\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.123999 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drblq"] Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.165843 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.182808 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.242192 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.462851 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:17 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:17 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:17 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.463176 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.467877 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lhq4g"] Oct 01 12:40:17 crc kubenswrapper[4913]: W1001 12:40:17.476555 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1e4468_40ba_4a47_8f89_99de7fec4071.slice/crio-2ef51dbd023380cfc77eaf2705a0e6d63ffc7134343f7aa085bc15867fa215a6 WatchSource:0}: Error finding container 2ef51dbd023380cfc77eaf2705a0e6d63ffc7134343f7aa085bc15867fa215a6: Status 404 returned error can't find the container with id 2ef51dbd023380cfc77eaf2705a0e6d63ffc7134343f7aa085bc15867fa215a6 Oct 01 12:40:17 crc kubenswrapper[4913]: I1001 12:40:17.501445 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsvt4"] Oct 01 12:40:17 crc kubenswrapper[4913]: W1001 12:40:17.508720 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0afa2ed5_a763_46b1_8bcd_c6775b0c7b67.slice/crio-3f4eeeba8653a537b12004cc3479d096f86b825d78814162fe44ee5cb2e1ee24 WatchSource:0}: Error finding container 3f4eeeba8653a537b12004cc3479d096f86b825d78814162fe44ee5cb2e1ee24: Status 404 returned error can't find the container with id 3f4eeeba8653a537b12004cc3479d096f86b825d78814162fe44ee5cb2e1ee24 Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.039648 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1e67eed42b7891e93cfe650a4c0962798baaa4aa657d5a772e1e1290acf98bbe"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.039975 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a7c52e88e95696fbe04d63fef28e4e7834394601c7075c7fd80f0fe17d300828"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.040176 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.042881 4913 generic.go:334] "Generic (PLEG): container finished" podID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerID="0ae8118be70aecd8c832f4da2ca52b02ac965eb1f57d099f4bd9d4eb52d67cdc" exitCode=0 Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.043101 4913 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drblq" event={"ID":"d412bdd1-98a3-4053-a4b5-c43eff851d62","Type":"ContainerDied","Data":"0ae8118be70aecd8c832f4da2ca52b02ac965eb1f57d099f4bd9d4eb52d67cdc"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.043198 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drblq" event={"ID":"d412bdd1-98a3-4053-a4b5-c43eff851d62","Type":"ContainerStarted","Data":"95243ae463b94b9d82116a789ce2339b7af046e4873b39b7fa1afc8a4a9f5da5"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.046046 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.049072 4913 generic.go:334] "Generic (PLEG): container finished" podID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerID="56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831" exitCode=0 Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.049117 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6h5" event={"ID":"9a5b3988-8324-4401-b951-3b1e3fea763a","Type":"ContainerDied","Data":"56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.049134 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6h5" event={"ID":"9a5b3988-8324-4401-b951-3b1e3fea763a","Type":"ContainerStarted","Data":"e8cb1c5e9305ccdead244dcf4e63665ca74f45b021dea1e405d277d6d0ca5a6d"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.064502 4913 generic.go:334] "Generic (PLEG): container finished" podID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerID="e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031" exitCode=0 Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.064614 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrj6h" event={"ID":"89fe354d-c11c-4c4f-a2c8-309d9da44911","Type":"ContainerDied","Data":"e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.071727 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfh"] Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.073066 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.075422 4913 generic.go:334] "Generic (PLEG): container finished" podID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerID="a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768" exitCode=0 Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.075479 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvt4" event={"ID":"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67","Type":"ContainerDied","Data":"a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.075507 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvt4" event={"ID":"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67","Type":"ContainerStarted","Data":"3f4eeeba8653a537b12004cc3479d096f86b825d78814162fe44ee5cb2e1ee24"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.076610 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.079258 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" event={"ID":"bd1e4468-40ba-4a47-8f89-99de7fec4071","Type":"ContainerStarted","Data":"5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.079308 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" event={"ID":"bd1e4468-40ba-4a47-8f89-99de7fec4071","Type":"ContainerStarted","Data":"2ef51dbd023380cfc77eaf2705a0e6d63ffc7134343f7aa085bc15867fa215a6"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.079417 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.081843 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfh"] Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.104802 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e030c584683ae04128852e170d009b649dbe13b273f802d45f116265d74f132d"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.104849 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f04dad141a2e18485fae15fd3ae6beb133e81e2e9d2fa736bc7ca98b40d12d2f"} Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.179728 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-utilities\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.179813 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-catalog-content\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.179953 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6nd\" (UniqueName: \"kubernetes.io/projected/de27ba3c-2707-4ab6-827e-b9d58f8968da-kube-api-access-cn6nd\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.187767 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" podStartSLOduration=129.187752806 podStartE2EDuration="2m9.187752806s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:18.185200486 +0000 UTC m=+150.088676144" watchObservedRunningTime="2025-10-01 12:40:18.187752806 +0000 UTC m=+150.091228384" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.281592 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-utilities\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.281663 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-catalog-content\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.281747 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6nd\" (UniqueName: \"kubernetes.io/projected/de27ba3c-2707-4ab6-827e-b9d58f8968da-kube-api-access-cn6nd\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.282612 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-catalog-content\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.282610 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-utilities\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.315283 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6nd\" (UniqueName: \"kubernetes.io/projected/de27ba3c-2707-4ab6-827e-b9d58f8968da-kube-api-access-cn6nd\") pod \"redhat-marketplace-9chfh\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " 
pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.397478 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.461749 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:18 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:18 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:18 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.461799 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.462520 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z84sc"] Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.463631 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.480986 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84sc"] Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.586880 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-catalog-content\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.586942 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9mkn\" (UniqueName: \"kubernetes.io/projected/e6cd5196-719c-4700-9091-2a1d43574717-kube-api-access-c9mkn\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.587021 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-utilities\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.679423 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfh"] Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.689110 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-catalog-content\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.689481 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-c9mkn\" (UniqueName: \"kubernetes.io/projected/e6cd5196-719c-4700-9091-2a1d43574717-kube-api-access-c9mkn\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.689535 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-utilities\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.689755 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-catalog-content\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.690790 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-utilities\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.709554 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9mkn\" (UniqueName: \"kubernetes.io/projected/e6cd5196-719c-4700-9091-2a1d43574717-kube-api-access-c9mkn\") pod \"redhat-marketplace-z84sc\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.775768 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:40:18 crc kubenswrapper[4913]: I1001 12:40:18.814883 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.119503 4913 generic.go:334] "Generic (PLEG): container finished" podID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerID="508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37" exitCode=0 Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.119640 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfh" event={"ID":"de27ba3c-2707-4ab6-827e-b9d58f8968da","Type":"ContainerDied","Data":"508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37"} Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.119879 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfh" event={"ID":"de27ba3c-2707-4ab6-827e-b9d58f8968da","Type":"ContainerStarted","Data":"69a28e9777b605d768f3a97d634e1b6c2dcf23ede2f2d8186282bf4908dcdb7e"} Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.122671 4913 generic.go:334] "Generic (PLEG): container finished" podID="e6fd5584-6878-4be8-83bb-f61003df2639" containerID="1137a411f44765a5a75fe26e2e6736875a8cc4f16724a24f4b6b5191f717df47" exitCode=0 Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.122764 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" event={"ID":"e6fd5584-6878-4be8-83bb-f61003df2639","Type":"ContainerDied","Data":"1137a411f44765a5a75fe26e2e6736875a8cc4f16724a24f4b6b5191f717df47"} Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.166950 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84sc"] Oct 01 12:40:19 crc kubenswrapper[4913]: W1001 12:40:19.190835 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cd5196_719c_4700_9091_2a1d43574717.slice/crio-08b24830f0b6573ece0ec120a15519e59e8b834e0e5c06cc76279d1fcc29e27b WatchSource:0}: Error finding container 08b24830f0b6573ece0ec120a15519e59e8b834e0e5c06cc76279d1fcc29e27b: Status 404 returned error can't find the container with id 08b24830f0b6573ece0ec120a15519e59e8b834e0e5c06cc76279d1fcc29e27b Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.192144 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.196212 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wp7l9" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.280160 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngwdj"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.281111 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.284874 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.292299 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngwdj"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.401800 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-catalog-content\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.401868 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-utilities\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.401888 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb4b8\" (UniqueName: \"kubernetes.io/projected/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-kube-api-access-gb4b8\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.463745 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:19 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:19 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:19 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.464055 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.504904 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-catalog-content\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.504990 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-utilities\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.505009 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb4b8\" (UniqueName: \"kubernetes.io/projected/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-kube-api-access-gb4b8\") pod 
\"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.505427 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-catalog-content\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.505711 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-utilities\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.548132 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb4b8\" (UniqueName: \"kubernetes.io/projected/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-kube-api-access-gb4b8\") pod \"redhat-operators-ngwdj\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.592321 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.593043 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.595357 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.595556 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.610252 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.623037 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.675425 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwprc"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.676673 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.687424 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwprc"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.709174 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.709236 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.811888 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m26kb\" (UniqueName: \"kubernetes.io/projected/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-kube-api-access-m26kb\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.811938 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-utilities\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.811959 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-catalog-content\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.812008 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.812033 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.813290 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.838221 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.906438 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngwdj"] Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.910971 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.917105 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m26kb\" (UniqueName: \"kubernetes.io/projected/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-kube-api-access-m26kb\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.917150 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-utilities\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.917173 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-catalog-content\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.918398 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-utilities\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.919585 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-catalog-content\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:19 crc kubenswrapper[4913]: W1001 12:40:19.932121 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7bb5ac_942e_4b86_ad14_e6dd271d725a.slice/crio-5852d404ef1db53c5269f10d6bdfb77c90f8bb7ffa430bdca1eb2f9292a73f74 WatchSource:0}: Error finding container 5852d404ef1db53c5269f10d6bdfb77c90f8bb7ffa430bdca1eb2f9292a73f74: Status 404 returned error can't find the container with id 5852d404ef1db53c5269f10d6bdfb77c90f8bb7ffa430bdca1eb2f9292a73f74 Oct 01 12:40:19 crc kubenswrapper[4913]: I1001 12:40:19.934577 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m26kb\" (UniqueName: \"kubernetes.io/projected/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-kube-api-access-m26kb\") pod \"redhat-operators-xwprc\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " 
pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.003669 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.003792 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.005668 4913 patch_prober.go:28] interesting pod/console-f9d7485db-97mb9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.005703 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-97mb9" podUID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.011620 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.133814 4913 generic.go:334] "Generic (PLEG): container finished" podID="e6cd5196-719c-4700-9091-2a1d43574717" containerID="62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783" exitCode=0 Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.133917 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84sc" event={"ID":"e6cd5196-719c-4700-9091-2a1d43574717","Type":"ContainerDied","Data":"62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783"} Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.133959 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84sc" event={"ID":"e6cd5196-719c-4700-9091-2a1d43574717","Type":"ContainerStarted","Data":"08b24830f0b6573ece0ec120a15519e59e8b834e0e5c06cc76279d1fcc29e27b"} Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.137527 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngwdj" event={"ID":"2a7bb5ac-942e-4b86-ad14-e6dd271d725a","Type":"ContainerStarted","Data":"5852d404ef1db53c5269f10d6bdfb77c90f8bb7ffa430bdca1eb2f9292a73f74"} Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.154080 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.155977 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.160306 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.161868 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.177227 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.187342 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.229309 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.229520 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.325451 4913 patch_prober.go:28] interesting pod/downloads-7954f5f757-ssczm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.325471 4913 patch_prober.go:28] interesting pod/downloads-7954f5f757-ssczm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.325506 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ssczm" podUID="a60b32b5-e9f3-4dd1-be69-05d4ec0789fe" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.325528 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ssczm" podUID="a60b32b5-e9f3-4dd1-be69-05d4ec0789fe" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.331598 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.331730 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.331801 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.369922 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.379631 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwprc"] Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.458621 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.462248 4913 patch_prober.go:28] interesting pod/router-default-5444994796-jk8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:40:20 crc kubenswrapper[4913]: [-]has-synced failed: reason withheld Oct 01 12:40:20 crc kubenswrapper[4913]: [+]process-running ok Oct 01 12:40:20 crc kubenswrapper[4913]: healthz check failed Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.462326 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jk8wn" podUID="088bf733-26d8-4478-b6d8-346657f863ac" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.473096 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.662227 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.756819 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlxkh\" (UniqueName: \"kubernetes.io/projected/e6fd5584-6878-4be8-83bb-f61003df2639-kube-api-access-tlxkh\") pod \"e6fd5584-6878-4be8-83bb-f61003df2639\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.756883 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6fd5584-6878-4be8-83bb-f61003df2639-config-volume\") pod \"e6fd5584-6878-4be8-83bb-f61003df2639\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.756955 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6fd5584-6878-4be8-83bb-f61003df2639-secret-volume\") pod \"e6fd5584-6878-4be8-83bb-f61003df2639\" (UID: \"e6fd5584-6878-4be8-83bb-f61003df2639\") " Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.760691 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fd5584-6878-4be8-83bb-f61003df2639-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6fd5584-6878-4be8-83bb-f61003df2639" (UID: "e6fd5584-6878-4be8-83bb-f61003df2639"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.762633 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fd5584-6878-4be8-83bb-f61003df2639-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6fd5584-6878-4be8-83bb-f61003df2639" (UID: "e6fd5584-6878-4be8-83bb-f61003df2639"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.770552 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fd5584-6878-4be8-83bb-f61003df2639-kube-api-access-tlxkh" (OuterVolumeSpecName: "kube-api-access-tlxkh") pod "e6fd5584-6878-4be8-83bb-f61003df2639" (UID: "e6fd5584-6878-4be8-83bb-f61003df2639"). InnerVolumeSpecName "kube-api-access-tlxkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.861572 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlxkh\" (UniqueName: \"kubernetes.io/projected/e6fd5584-6878-4be8-83bb-f61003df2639-kube-api-access-tlxkh\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.861902 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6fd5584-6878-4be8-83bb-f61003df2639-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.861913 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6fd5584-6878-4be8-83bb-f61003df2639-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:20 crc kubenswrapper[4913]: I1001 12:40:20.903765 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.160311 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" event={"ID":"e6fd5584-6878-4be8-83bb-f61003df2639","Type":"ContainerDied","Data":"acbc30e033e995bbb1deaf0de0d3d99f57a80ea62aed466d6e698903c3267bfa"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.160340 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc" Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.160354 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acbc30e033e995bbb1deaf0de0d3d99f57a80ea62aed466d6e698903c3267bfa" Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.173923 4913 generic.go:334] "Generic (PLEG): container finished" podID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerID="f99cc3d78f45cbb9836df457df720dd34468fc3e9515fcc99bd27ad9593fe132" exitCode=0 Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.174108 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwprc" event={"ID":"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7","Type":"ContainerDied","Data":"f99cc3d78f45cbb9836df457df720dd34468fc3e9515fcc99bd27ad9593fe132"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.174182 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwprc" event={"ID":"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7","Type":"ContainerStarted","Data":"15db3cce43246d2ff927226b2da8b021af66a3dd768b75c253f96303ae42e36f"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.177508 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fa48fe37-156a-4df2-b8c1-6d07961d14d3","Type":"ContainerStarted","Data":"527e5929ac06f38a0c4474a223fbd2637a22d1c5934337cfbba383d7a2edcd10"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.189089 4913 generic.go:334] "Generic (PLEG): container finished" podID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerID="ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07" exitCode=0 Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.189235 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngwdj" 
event={"ID":"2a7bb5ac-942e-4b86-ad14-e6dd271d725a","Type":"ContainerDied","Data":"ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.192583 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"abc86b44-cac0-4c7d-bae3-a6d8890283dc","Type":"ContainerStarted","Data":"e65ce5647f2a6fa241fe03691c3a00e57feed40dd47ba3a2709f9f77fc96eadd"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.192608 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"abc86b44-cac0-4c7d-bae3-a6d8890283dc","Type":"ContainerStarted","Data":"fa731ea23eb7819f8cffbb23d1d8293cb9a6317e75c98f3537a989b2cbedee76"} Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.462259 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.465043 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jk8wn" Oct 01 12:40:21 crc kubenswrapper[4913]: I1001 12:40:21.482576 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.482560913 podStartE2EDuration="2.482560913s" podCreationTimestamp="2025-10-01 12:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:21.21760014 +0000 UTC m=+153.121075728" watchObservedRunningTime="2025-10-01 12:40:21.482560913 +0000 UTC m=+153.386036491" Oct 01 12:40:22 crc kubenswrapper[4913]: I1001 12:40:22.216241 4913 generic.go:334] "Generic (PLEG): container finished" podID="abc86b44-cac0-4c7d-bae3-a6d8890283dc" containerID="e65ce5647f2a6fa241fe03691c3a00e57feed40dd47ba3a2709f9f77fc96eadd" exitCode=0 Oct 01 12:40:22 crc kubenswrapper[4913]: I1001 12:40:22.216340 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"abc86b44-cac0-4c7d-bae3-a6d8890283dc","Type":"ContainerDied","Data":"e65ce5647f2a6fa241fe03691c3a00e57feed40dd47ba3a2709f9f77fc96eadd"} Oct 01 12:40:22 crc kubenswrapper[4913]: I1001 12:40:22.222310 4913 generic.go:334] "Generic (PLEG): container finished" podID="fa48fe37-156a-4df2-b8c1-6d07961d14d3" containerID="b7eae18a61d8106054df26d75c147a71a1b7d3050e4e0608f17729a0636767fb" exitCode=0 Oct 01 12:40:22 crc kubenswrapper[4913]: I1001 12:40:22.222409 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fa48fe37-156a-4df2-b8c1-6d07961d14d3","Type":"ContainerDied","Data":"b7eae18a61d8106054df26d75c147a71a1b7d3050e4e0608f17729a0636767fb"} Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.506437 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.604950 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kubelet-dir\") pod \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.604999 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kube-api-access\") pod \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\" (UID: \"abc86b44-cac0-4c7d-bae3-a6d8890283dc\") " Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.605068 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "abc86b44-cac0-4c7d-bae3-a6d8890283dc" (UID: "abc86b44-cac0-4c7d-bae3-a6d8890283dc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.605252 4913 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.611229 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "abc86b44-cac0-4c7d-bae3-a6d8890283dc" (UID: "abc86b44-cac0-4c7d-bae3-a6d8890283dc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.652734 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.706133 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc86b44-cac0-4c7d-bae3-a6d8890283dc-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.806844 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kubelet-dir\") pod \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.806930 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kube-api-access\") pod \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\" (UID: \"fa48fe37-156a-4df2-b8c1-6d07961d14d3\") " Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.807687 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa48fe37-156a-4df2-b8c1-6d07961d14d3" (UID: "fa48fe37-156a-4df2-b8c1-6d07961d14d3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.809829 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa48fe37-156a-4df2-b8c1-6d07961d14d3" (UID: "fa48fe37-156a-4df2-b8c1-6d07961d14d3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.908776 4913 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:23 crc kubenswrapper[4913]: I1001 12:40:23.908807 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa48fe37-156a-4df2-b8c1-6d07961d14d3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:24 crc kubenswrapper[4913]: I1001 12:40:24.236439 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fa48fe37-156a-4df2-b8c1-6d07961d14d3","Type":"ContainerDied","Data":"527e5929ac06f38a0c4474a223fbd2637a22d1c5934337cfbba383d7a2edcd10"} Oct 01 12:40:24 crc kubenswrapper[4913]: I1001 12:40:24.236473 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527e5929ac06f38a0c4474a223fbd2637a22d1c5934337cfbba383d7a2edcd10" Oct 01 12:40:24 crc kubenswrapper[4913]: I1001 12:40:24.236493 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:40:24 crc kubenswrapper[4913]: I1001 12:40:24.240491 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"abc86b44-cac0-4c7d-bae3-a6d8890283dc","Type":"ContainerDied","Data":"fa731ea23eb7819f8cffbb23d1d8293cb9a6317e75c98f3537a989b2cbedee76"} Oct 01 12:40:24 crc kubenswrapper[4913]: I1001 12:40:24.240527 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:40:24 crc kubenswrapper[4913]: I1001 12:40:24.240584 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa731ea23eb7819f8cffbb23d1d8293cb9a6317e75c98f3537a989b2cbedee76" Oct 01 12:40:25 crc kubenswrapper[4913]: I1001 12:40:25.528225 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vqbl5" Oct 01 12:40:28 crc kubenswrapper[4913]: I1001 12:40:28.038861 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:40:30 crc kubenswrapper[4913]: I1001 12:40:30.035289 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:30 crc kubenswrapper[4913]: I1001 12:40:30.043629 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:40:30 crc kubenswrapper[4913]: I1001 12:40:30.318676 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ssczm" Oct 01 12:40:31 crc kubenswrapper[4913]: I1001 12:40:31.731079 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:40:31 crc kubenswrapper[4913]: I1001 12:40:31.736389 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18-metrics-certs\") pod \"network-metrics-daemon-8c8wp\" (UID: \"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18\") " pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:40:31 crc kubenswrapper[4913]: I1001 12:40:31.955478 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8c8wp" Oct 01 12:40:37 crc kubenswrapper[4913]: I1001 12:40:37.247732 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:40:40 crc kubenswrapper[4913]: I1001 12:40:40.083709 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:40:40 crc kubenswrapper[4913]: I1001 12:40:40.084361 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:40:42 crc kubenswrapper[4913]: I1001 12:40:42.051586 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8c8wp"] Oct 01 12:40:42 crc kubenswrapper[4913]: E1001 12:40:42.482011 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 12:40:42 crc kubenswrapper[4913]: E1001 12:40:42.482511 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cn6nd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9chfh_openshift-marketplace(de27ba3c-2707-4ab6-827e-b9d58f8968da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:42 crc kubenswrapper[4913]: E1001 12:40:42.483742 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9chfh" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" Oct 01 12:40:43 crc kubenswrapper[4913]: E1001 12:40:43.560152 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 12:40:43 crc kubenswrapper[4913]: E1001 12:40:43.560600 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9mkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-z84sc_openshift-marketplace(e6cd5196-719c-4700-9091-2a1d43574717): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:43 crc kubenswrapper[4913]: E1001 12:40:43.561708 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-z84sc" podUID="e6cd5196-719c-4700-9091-2a1d43574717" Oct 01 12:40:43 crc kubenswrapper[4913]: E1001 12:40:43.562352 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 12:40:43 crc kubenswrapper[4913]: E1001 12:40:43.562453 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58q5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lsvt4_openshift-marketplace(0afa2ed5-a763-46b1-8bcd-c6775b0c7b67): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:43 crc kubenswrapper[4913]: E1001 12:40:43.563593 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lsvt4" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" Oct 01 12:40:47 crc kubenswrapper[4913]: W1001 12:40:47.081485 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1530fa1_ff03_4aa3_8a45_a5af1dbf4b18.slice/crio-b382a9f9692acc0e472ad221f4a37c76fa1c8aa24bab9f2ceaf6eea7ca446503 WatchSource:0}: Error finding container b382a9f9692acc0e472ad221f4a37c76fa1c8aa24bab9f2ceaf6eea7ca446503: Status 404 returned error can't find the container with id b382a9f9692acc0e472ad221f4a37c76fa1c8aa24bab9f2ceaf6eea7ca446503 Oct 01 12:40:47 crc kubenswrapper[4913]: E1001 12:40:47.081583 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-z84sc" podUID="e6cd5196-719c-4700-9091-2a1d43574717" Oct 01 12:40:47 crc kubenswrapper[4913]: E1001 12:40:47.081612 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9chfh" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" Oct 01 12:40:47 crc kubenswrapper[4913]: E1001 12:40:47.081606 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lsvt4" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" Oct 01 12:40:47 crc kubenswrapper[4913]: I1001 12:40:47.364862 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" event={"ID":"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18","Type":"ContainerStarted","Data":"b382a9f9692acc0e472ad221f4a37c76fa1c8aa24bab9f2ceaf6eea7ca446503"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.371185 4913 generic.go:334] "Generic (PLEG): container finished" podID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerID="7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca" exitCode=0 Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.371449 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngwdj" event={"ID":"2a7bb5ac-942e-4b86-ad14-e6dd271d725a","Type":"ContainerDied","Data":"7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.373714 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" event={"ID":"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18","Type":"ContainerStarted","Data":"07b716669e065506f39993af77eeaf02a94d20dbedebf712f0897f85c343e61f"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.373771 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8c8wp" event={"ID":"c1530fa1-ff03-4aa3-8a45-a5af1dbf4b18","Type":"ContainerStarted","Data":"127ace9a1287f05e82db8ee0460bacd1ce97655514f6d6b305f9c1d89c200be2"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.375642 4913 generic.go:334] "Generic (PLEG): container finished" podID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerID="5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12" exitCode=0 Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.375742 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrj6h" event={"ID":"89fe354d-c11c-4c4f-a2c8-309d9da44911","Type":"ContainerDied","Data":"5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.377838 4913 generic.go:334] "Generic (PLEG): container finished" podID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerID="9a5a34f93ddb73cff664745ce8c80377057ed373179f6e9231fd79c21b1eea97" exitCode=0 Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.377904 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwprc" event={"ID":"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7","Type":"ContainerDied","Data":"9a5a34f93ddb73cff664745ce8c80377057ed373179f6e9231fd79c21b1eea97"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.382411 4913 generic.go:334] "Generic (PLEG): container finished" podID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerID="fb9f1aea2c440097d740ee8abdca4ea3cc4c054bfd7423796ebc025a180c04a8" exitCode=0 Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.382476 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drblq" event={"ID":"d412bdd1-98a3-4053-a4b5-c43eff851d62","Type":"ContainerDied","Data":"fb9f1aea2c440097d740ee8abdca4ea3cc4c054bfd7423796ebc025a180c04a8"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.387724 4913 generic.go:334] "Generic (PLEG): container 
finished" podID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerID="0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f" exitCode=0 Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.387787 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6h5" event={"ID":"9a5b3988-8324-4401-b951-3b1e3fea763a","Type":"ContainerDied","Data":"0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f"} Oct 01 12:40:48 crc kubenswrapper[4913]: I1001 12:40:48.411619 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8c8wp" podStartSLOduration=159.411594026 podStartE2EDuration="2m39.411594026s" podCreationTimestamp="2025-10-01 12:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:48.406115208 +0000 UTC m=+180.309590816" watchObservedRunningTime="2025-10-01 12:40:48.411594026 +0000 UTC m=+180.315069634" Oct 01 12:40:50 crc kubenswrapper[4913]: I1001 12:40:50.473132 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-stkv5" Oct 01 12:40:51 crc kubenswrapper[4913]: I1001 12:40:51.405888 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drblq" event={"ID":"d412bdd1-98a3-4053-a4b5-c43eff851d62","Type":"ContainerStarted","Data":"7a96a5bd9097f18132ae87acee08e9079d7fa2bfdb178a519b1d8ae5a5908c92"} Oct 01 12:40:51 crc kubenswrapper[4913]: I1001 12:40:51.428440 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-drblq" podStartSLOduration=2.821833339 podStartE2EDuration="35.428419047s" podCreationTimestamp="2025-10-01 12:40:16 +0000 UTC" firstStartedPulling="2025-10-01 12:40:18.045806822 +0000 UTC m=+149.949282400" lastFinishedPulling="2025-10-01 12:40:50.65239254 +0000 UTC m=+182.555868108" observedRunningTime="2025-10-01 12:40:51.424556903 +0000 UTC m=+183.328032601" watchObservedRunningTime="2025-10-01 12:40:51.428419047 +0000 UTC m=+183.331894635" Oct 01 12:40:53 crc kubenswrapper[4913]: I1001 12:40:53.418241 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwprc" event={"ID":"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7","Type":"ContainerStarted","Data":"17121d3b6537ff0a8d63498f342c7476d894c2e22eaea36d2953f6114508a8e9"} Oct 01 12:40:53 crc kubenswrapper[4913]: I1001 12:40:53.440704 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwprc" podStartSLOduration=3.301707511 podStartE2EDuration="34.440687637s" podCreationTimestamp="2025-10-01 12:40:19 +0000 UTC" firstStartedPulling="2025-10-01 12:40:21.184552872 +0000 UTC m=+153.088028450" lastFinishedPulling="2025-10-01 12:40:52.323532998 +0000 UTC m=+184.227008576" observedRunningTime="2025-10-01 12:40:53.437409278 +0000 UTC m=+185.340884866" watchObservedRunningTime="2025-10-01 12:40:53.440687637 +0000 UTC m=+185.344163205" Oct 01 12:40:54 crc kubenswrapper[4913]: I1001 12:40:54.424416 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngwdj" event={"ID":"2a7bb5ac-942e-4b86-ad14-e6dd271d725a","Type":"ContainerStarted","Data":"99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091"} Oct 01 12:40:54 crc kubenswrapper[4913]: I1001 12:40:54.426741 
4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6h5" event={"ID":"9a5b3988-8324-4401-b951-3b1e3fea763a","Type":"ContainerStarted","Data":"9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c"} Oct 01 12:40:54 crc kubenswrapper[4913]: I1001 12:40:54.428928 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrj6h" event={"ID":"89fe354d-c11c-4c4f-a2c8-309d9da44911","Type":"ContainerStarted","Data":"a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3"} Oct 01 12:40:54 crc kubenswrapper[4913]: I1001 12:40:54.444646 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngwdj" podStartSLOduration=3.143388243 podStartE2EDuration="35.444622572s" podCreationTimestamp="2025-10-01 12:40:19 +0000 UTC" firstStartedPulling="2025-10-01 12:40:21.19293651 +0000 UTC m=+153.096412078" lastFinishedPulling="2025-10-01 12:40:53.494170829 +0000 UTC m=+185.397646407" observedRunningTime="2025-10-01 12:40:54.441337633 +0000 UTC m=+186.344813231" watchObservedRunningTime="2025-10-01 12:40:54.444622572 +0000 UTC m=+186.348098150" Oct 01 12:40:54 crc kubenswrapper[4913]: I1001 12:40:54.467388 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4k6h5" podStartSLOduration=2.92468281 podStartE2EDuration="38.467366989s" podCreationTimestamp="2025-10-01 12:40:16 +0000 UTC" firstStartedPulling="2025-10-01 12:40:18.050012675 +0000 UTC m=+149.953488253" lastFinishedPulling="2025-10-01 12:40:53.592696854 +0000 UTC m=+185.496172432" observedRunningTime="2025-10-01 12:40:54.46295149 +0000 UTC m=+186.366427088" watchObservedRunningTime="2025-10-01 12:40:54.467366989 +0000 UTC m=+186.370842567" Oct 01 12:40:54 crc kubenswrapper[4913]: I1001 12:40:54.488029 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrj6h" podStartSLOduration=3.069786351 podStartE2EDuration="38.48800805s" podCreationTimestamp="2025-10-01 12:40:16 +0000 UTC" firstStartedPulling="2025-10-01 12:40:18.068819406 +0000 UTC m=+149.972294984" lastFinishedPulling="2025-10-01 12:40:53.487041105 +0000 UTC m=+185.390516683" observedRunningTime="2025-10-01 12:40:54.486107158 +0000 UTC m=+186.389582806" watchObservedRunningTime="2025-10-01 12:40:54.48800805 +0000 UTC m=+186.391483628" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.233856 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.456116 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.456161 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.665629 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.675358 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.709906 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.711396 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.815385 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.815700 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:56 crc kubenswrapper[4913]: I1001 12:40:56.854421 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:57 crc kubenswrapper[4913]: I1001 12:40:57.478576 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:40:58 crc kubenswrapper[4913]: I1001 12:40:58.481383 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-drblq"] Oct 01 12:40:58 crc kubenswrapper[4913]: I1001 12:40:58.510462 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:40:59 crc kubenswrapper[4913]: I1001 12:40:59.455158 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-drblq" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="registry-server" containerID="cri-o://7a96a5bd9097f18132ae87acee08e9079d7fa2bfdb178a519b1d8ae5a5908c92" gracePeriod=2 Oct 01 12:40:59 crc kubenswrapper[4913]: I1001 12:40:59.627515 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:59 crc kubenswrapper[4913]: I1001 12:40:59.627564 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:40:59 crc kubenswrapper[4913]: I1001 12:40:59.671919 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.012315 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.012691 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.078993 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.462152 4913 generic.go:334] "Generic (PLEG): container finished" podID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerID="7a96a5bd9097f18132ae87acee08e9079d7fa2bfdb178a519b1d8ae5a5908c92" exitCode=0 Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.462237 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drblq" event={"ID":"d412bdd1-98a3-4053-a4b5-c43eff851d62","Type":"ContainerDied","Data":"7a96a5bd9097f18132ae87acee08e9079d7fa2bfdb178a519b1d8ae5a5908c92"} Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.495618 4913 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:41:00 crc kubenswrapper[4913]: I1001 12:41:00.562522 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.243447 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.318866 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-utilities\") pod \"d412bdd1-98a3-4053-a4b5-c43eff851d62\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.319077 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4cc\" (UniqueName: \"kubernetes.io/projected/d412bdd1-98a3-4053-a4b5-c43eff851d62-kube-api-access-7b4cc\") pod \"d412bdd1-98a3-4053-a4b5-c43eff851d62\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.319128 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-catalog-content\") pod \"d412bdd1-98a3-4053-a4b5-c43eff851d62\" (UID: \"d412bdd1-98a3-4053-a4b5-c43eff851d62\") " Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.319568 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-utilities" (OuterVolumeSpecName: "utilities") pod "d412bdd1-98a3-4053-a4b5-c43eff851d62" (UID: "d412bdd1-98a3-4053-a4b5-c43eff851d62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.324074 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d412bdd1-98a3-4053-a4b5-c43eff851d62-kube-api-access-7b4cc" (OuterVolumeSpecName: "kube-api-access-7b4cc") pod "d412bdd1-98a3-4053-a4b5-c43eff851d62" (UID: "d412bdd1-98a3-4053-a4b5-c43eff851d62"). InnerVolumeSpecName "kube-api-access-7b4cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.365798 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d412bdd1-98a3-4053-a4b5-c43eff851d62" (UID: "d412bdd1-98a3-4053-a4b5-c43eff851d62"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.420904 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4cc\" (UniqueName: \"kubernetes.io/projected/d412bdd1-98a3-4053-a4b5-c43eff851d62-kube-api-access-7b4cc\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.420946 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.420956 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d412bdd1-98a3-4053-a4b5-c43eff851d62-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.469238 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drblq" event={"ID":"d412bdd1-98a3-4053-a4b5-c43eff851d62","Type":"ContainerDied","Data":"95243ae463b94b9d82116a789ce2339b7af046e4873b39b7fa1afc8a4a9f5da5"} Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.469333 4913 scope.go:117] "RemoveContainer" containerID="7a96a5bd9097f18132ae87acee08e9079d7fa2bfdb178a519b1d8ae5a5908c92" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.469451 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drblq" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.486690 4913 scope.go:117] "RemoveContainer" containerID="fb9f1aea2c440097d740ee8abdca4ea3cc4c054bfd7423796ebc025a180c04a8" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.503889 4913 scope.go:117] "RemoveContainer" containerID="0ae8118be70aecd8c832f4da2ca52b02ac965eb1f57d099f4bd9d4eb52d67cdc" Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.529057 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-drblq"] Oct 01 12:41:01 crc kubenswrapper[4913]: I1001 12:41:01.532670 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-drblq"] Oct 01 12:41:02 crc kubenswrapper[4913]: I1001 12:41:02.812463 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" path="/var/lib/kubelet/pods/d412bdd1-98a3-4053-a4b5-c43eff851d62/volumes" Oct 01 12:41:03 crc kubenswrapper[4913]: I1001 12:41:03.078149 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwprc"] Oct 01 12:41:03 crc kubenswrapper[4913]: I1001 12:41:03.478629 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwprc" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="registry-server" containerID="cri-o://17121d3b6537ff0a8d63498f342c7476d894c2e22eaea36d2953f6114508a8e9" gracePeriod=2 Oct 01 12:41:04 crc kubenswrapper[4913]: I1001 12:41:04.486077 4913 generic.go:334] "Generic (PLEG): container finished" podID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerID="17121d3b6537ff0a8d63498f342c7476d894c2e22eaea36d2953f6114508a8e9" exitCode=0 Oct 01 12:41:04 crc kubenswrapper[4913]: I1001 12:41:04.486125 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwprc" 
event={"ID":"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7","Type":"ContainerDied","Data":"17121d3b6537ff0a8d63498f342c7476d894c2e22eaea36d2953f6114508a8e9"} Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.749122 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.877638 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-utilities\") pod \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.878074 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-catalog-content\") pod \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.878099 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26kb\" (UniqueName: \"kubernetes.io/projected/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-kube-api-access-m26kb\") pod \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\" (UID: \"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7\") " Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.880560 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-utilities" (OuterVolumeSpecName: "utilities") pod "a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" (UID: "a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.885657 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-kube-api-access-m26kb" (OuterVolumeSpecName: "kube-api-access-m26kb") pod "a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" (UID: "a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7"). InnerVolumeSpecName "kube-api-access-m26kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.979681 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.979709 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m26kb\" (UniqueName: \"kubernetes.io/projected/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-kube-api-access-m26kb\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:05 crc kubenswrapper[4913]: I1001 12:41:05.989050 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" (UID: "a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.080927 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.493355 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.504657 4913 generic.go:334] "Generic (PLEG): container finished" podID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerID="a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97" exitCode=0 Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.504716 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfh" event={"ID":"de27ba3c-2707-4ab6-827e-b9d58f8968da","Type":"ContainerDied","Data":"a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97"} Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.511073 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84sc" event={"ID":"e6cd5196-719c-4700-9091-2a1d43574717","Type":"ContainerDied","Data":"ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe"} Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.511895 4913 generic.go:334] "Generic (PLEG): container finished" podID="e6cd5196-719c-4700-9091-2a1d43574717" containerID="ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe" exitCode=0 Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.517137 4913 generic.go:334] "Generic (PLEG): container finished" podID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerID="fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2" exitCode=0 Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.517205 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvt4" event={"ID":"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67","Type":"ContainerDied","Data":"fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2"} Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.522059 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwprc" event={"ID":"a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7","Type":"ContainerDied","Data":"15db3cce43246d2ff927226b2da8b021af66a3dd768b75c253f96303ae42e36f"} Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.522108 4913 scope.go:117] "RemoveContainer" containerID="17121d3b6537ff0a8d63498f342c7476d894c2e22eaea36d2953f6114508a8e9" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.522239 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwprc" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.551971 4913 scope.go:117] "RemoveContainer" containerID="9a5a34f93ddb73cff664745ce8c80377057ed373179f6e9231fd79c21b1eea97" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.575150 4913 scope.go:117] "RemoveContainer" containerID="f99cc3d78f45cbb9836df457df720dd34468fc3e9515fcc99bd27ad9593fe132" Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.588819 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwprc"] Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.591378 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xwprc"] Oct 01 12:41:06 crc kubenswrapper[4913]: I1001 12:41:06.818581 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" path="/var/lib/kubelet/pods/a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7/volumes" Oct 01 12:41:07 crc kubenswrapper[4913]: I1001 12:41:07.529637 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfh" event={"ID":"de27ba3c-2707-4ab6-827e-b9d58f8968da","Type":"ContainerStarted","Data":"7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a"} Oct 01 12:41:07 crc kubenswrapper[4913]: I1001 12:41:07.532847 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84sc" event={"ID":"e6cd5196-719c-4700-9091-2a1d43574717","Type":"ContainerStarted","Data":"dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b"} Oct 01 12:41:07 crc kubenswrapper[4913]: I1001 12:41:07.535982 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvt4" event={"ID":"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67","Type":"ContainerStarted","Data":"311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337"} Oct 01 12:41:07 crc kubenswrapper[4913]: I1001 12:41:07.552964 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9chfh" podStartSLOduration=1.7342435680000001 podStartE2EDuration="49.552946527s" podCreationTimestamp="2025-10-01 12:40:18 +0000 UTC" firstStartedPulling="2025-10-01 12:40:19.121904116 +0000 UTC m=+151.025379694" lastFinishedPulling="2025-10-01 12:41:06.940607045 +0000 UTC m=+198.844082653" observedRunningTime="2025-10-01 12:41:07.552413333 +0000 UTC m=+199.455888931" watchObservedRunningTime="2025-10-01 12:41:07.552946527 +0000 UTC m=+199.456422105" Oct 01 12:41:07 crc kubenswrapper[4913]: I1001 12:41:07.567378 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsvt4" podStartSLOduration=2.644491512 podStartE2EDuration="51.567360468s" podCreationTimestamp="2025-10-01 12:40:16 +0000 UTC" firstStartedPulling="2025-10-01 12:40:18.076868635 +0000 UTC m=+149.980344213" lastFinishedPulling="2025-10-01 12:41:06.999737571 +0000 UTC m=+198.903213169" observedRunningTime="2025-10-01 12:41:07.566825373 +0000 UTC m=+199.470301121" watchObservedRunningTime="2025-10-01 12:41:07.567360468 +0000 UTC m=+199.470836046" Oct 01 12:41:07 crc kubenswrapper[4913]: I1001 12:41:07.586182 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z84sc" podStartSLOduration=2.707633914 podStartE2EDuration="49.586162662s" podCreationTimestamp="2025-10-01 
12:40:18 +0000 UTC" firstStartedPulling="2025-10-01 12:40:20.138790232 +0000 UTC m=+152.042265810" lastFinishedPulling="2025-10-01 12:41:07.01731898 +0000 UTC m=+198.920794558" observedRunningTime="2025-10-01 12:41:07.5832209 +0000 UTC m=+199.486696478" watchObservedRunningTime="2025-10-01 12:41:07.586162662 +0000 UTC m=+199.489638230" Oct 01 12:41:08 crc kubenswrapper[4913]: I1001 12:41:08.398485 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:41:08 crc kubenswrapper[4913]: I1001 12:41:08.398675 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:41:08 crc kubenswrapper[4913]: I1001 12:41:08.438006 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:41:08 crc kubenswrapper[4913]: I1001 12:41:08.776546 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:41:08 crc kubenswrapper[4913]: I1001 12:41:08.776588 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:41:08 crc kubenswrapper[4913]: I1001 12:41:08.818308 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:41:10 crc kubenswrapper[4913]: I1001 12:41:10.083883 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:41:10 crc kubenswrapper[4913]: I1001 12:41:10.083937 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:41:17 crc kubenswrapper[4913]: I1001 12:41:17.022890 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:41:17 crc kubenswrapper[4913]: I1001 12:41:17.026447 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:41:17 crc kubenswrapper[4913]: I1001 12:41:17.132840 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:41:17 crc kubenswrapper[4913]: I1001 12:41:17.623818 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:41:18 crc kubenswrapper[4913]: I1001 12:41:18.276354 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsvt4"] Oct 01 12:41:18 crc kubenswrapper[4913]: I1001 12:41:18.452249 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:41:18 crc kubenswrapper[4913]: I1001 12:41:18.819113 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:41:19 crc 
kubenswrapper[4913]: I1001 12:41:19.598193 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lsvt4" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="registry-server" containerID="cri-o://311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337" gracePeriod=2 Oct 01 12:41:19 crc kubenswrapper[4913]: I1001 12:41:19.934523 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.070171 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-utilities\") pod \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.070435 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58q5p\" (UniqueName: \"kubernetes.io/projected/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-kube-api-access-58q5p\") pod \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.070625 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-catalog-content\") pod \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\" (UID: \"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67\") " Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.070973 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-utilities" (OuterVolumeSpecName: "utilities") pod "0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" (UID: "0afa2ed5-a763-46b1-8bcd-c6775b0c7b67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.074881 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.076957 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-kube-api-access-58q5p" (OuterVolumeSpecName: "kube-api-access-58q5p") pod "0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" (UID: "0afa2ed5-a763-46b1-8bcd-c6775b0c7b67"). InnerVolumeSpecName "kube-api-access-58q5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.117168 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" (UID: "0afa2ed5-a763-46b1-8bcd-c6775b0c7b67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.175552 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.175585 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58q5p\" (UniqueName: \"kubernetes.io/projected/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67-kube-api-access-58q5p\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.604509 4913 generic.go:334] "Generic (PLEG): container finished" podID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerID="311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337" exitCode=0 Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.604567 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvt4" event={"ID":"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67","Type":"ContainerDied","Data":"311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337"} Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.604603 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvt4" event={"ID":"0afa2ed5-a763-46b1-8bcd-c6775b0c7b67","Type":"ContainerDied","Data":"3f4eeeba8653a537b12004cc3479d096f86b825d78814162fe44ee5cb2e1ee24"} Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.604627 4913 scope.go:117] "RemoveContainer" containerID="311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.604775 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsvt4" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.628175 4913 scope.go:117] "RemoveContainer" containerID="fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.650701 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsvt4"] Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.653704 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lsvt4"] Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.677174 4913 scope.go:117] "RemoveContainer" containerID="a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.681324 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84sc"] Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.681608 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z84sc" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="registry-server" containerID="cri-o://dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b" gracePeriod=2 Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.692335 4913 scope.go:117] "RemoveContainer" containerID="311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337" Oct 01 12:41:20 crc kubenswrapper[4913]: E1001 12:41:20.692723 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337\": container with ID starting with 311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337 not found: ID does not exist" containerID="311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.692767 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337"} err="failed to get container status \"311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337\": rpc error: code = NotFound desc = could not find container \"311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337\": container with ID starting with 311ef0b8036a904b9e274959057f5d953b1bfa7fae3d2ae3347b5b7f5a2f9337 not found: ID does not exist" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.692835 4913 scope.go:117] "RemoveContainer" containerID="fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2" Oct 01 12:41:20 crc kubenswrapper[4913]: E1001 12:41:20.693188 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2\": container with ID starting with fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2 not found: ID does not exist" containerID="fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.693215 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2"} err="failed to get container status \"fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2\": rpc error: 
code = NotFound desc = could not find container \"fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2\": container with ID starting with fbfdefcb9689c8c0370c8cee24eef6a14a9a1caf6d4e6588500c52776e9304b2 not found: ID does not exist" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.693237 4913 scope.go:117] "RemoveContainer" containerID="a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768" Oct 01 12:41:20 crc kubenswrapper[4913]: E1001 12:41:20.693509 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768\": container with ID starting with a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768 not found: ID does not exist" containerID="a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.693529 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768"} err="failed to get container status \"a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768\": rpc error: code = NotFound desc = could not find container \"a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768\": container with ID starting with a75f08dad6c95e1ff3330f8c635d5be6c4e9884e1cbd237d241f7e515bc4f768 not found: ID does not exist" Oct 01 12:41:20 crc kubenswrapper[4913]: I1001 12:41:20.814297 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" path="/var/lib/kubelet/pods/0afa2ed5-a763-46b1-8bcd-c6775b0c7b67/volumes" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.019423 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.187226 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-catalog-content\") pod \"e6cd5196-719c-4700-9091-2a1d43574717\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.187351 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-utilities\") pod \"e6cd5196-719c-4700-9091-2a1d43574717\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.187442 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9mkn\" (UniqueName: \"kubernetes.io/projected/e6cd5196-719c-4700-9091-2a1d43574717-kube-api-access-c9mkn\") pod \"e6cd5196-719c-4700-9091-2a1d43574717\" (UID: \"e6cd5196-719c-4700-9091-2a1d43574717\") " Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.188099 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-utilities" (OuterVolumeSpecName: "utilities") pod "e6cd5196-719c-4700-9091-2a1d43574717" (UID: "e6cd5196-719c-4700-9091-2a1d43574717"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.192323 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cd5196-719c-4700-9091-2a1d43574717-kube-api-access-c9mkn" (OuterVolumeSpecName: "kube-api-access-c9mkn") pod "e6cd5196-719c-4700-9091-2a1d43574717" (UID: "e6cd5196-719c-4700-9091-2a1d43574717"). InnerVolumeSpecName "kube-api-access-c9mkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.202986 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6cd5196-719c-4700-9091-2a1d43574717" (UID: "e6cd5196-719c-4700-9091-2a1d43574717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.288737 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.288769 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9mkn\" (UniqueName: \"kubernetes.io/projected/e6cd5196-719c-4700-9091-2a1d43574717-kube-api-access-c9mkn\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.288779 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cd5196-719c-4700-9091-2a1d43574717-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.611876 4913 generic.go:334] "Generic (PLEG): container finished" podID="e6cd5196-719c-4700-9091-2a1d43574717" containerID="dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b" exitCode=0 Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.611930 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84sc" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.611983 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84sc" event={"ID":"e6cd5196-719c-4700-9091-2a1d43574717","Type":"ContainerDied","Data":"dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b"} Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.612044 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84sc" event={"ID":"e6cd5196-719c-4700-9091-2a1d43574717","Type":"ContainerDied","Data":"08b24830f0b6573ece0ec120a15519e59e8b834e0e5c06cc76279d1fcc29e27b"} Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.612070 4913 scope.go:117] "RemoveContainer" containerID="dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.629550 4913 scope.go:117] "RemoveContainer" containerID="ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.644666 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84sc"] Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.648059 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84sc"] Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.672289 4913 scope.go:117] "RemoveContainer" containerID="62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.687878 4913 scope.go:117] "RemoveContainer" containerID="dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b" Oct 01 12:41:21 crc kubenswrapper[4913]: E1001 12:41:21.688297 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b\": container with ID starting with dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b not found: ID does not exist" containerID="dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.688344 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b"} err="failed to get container status \"dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b\": rpc error: code = NotFound desc = could not find container \"dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b\": container with ID starting with dee148e0581761b609104633d0d6e8d2d3dec3251f7d0458a1b48579981d1a7b not found: ID does not exist" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.688373 4913 scope.go:117] "RemoveContainer" containerID="ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe" Oct 01 12:41:21 crc kubenswrapper[4913]: E1001 12:41:21.688689 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe\": container with ID starting with ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe not found: ID does not exist" containerID="ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.688719 4913 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe"} err="failed to get container status \"ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe\": rpc error: code = NotFound desc = could not find container \"ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe\": container with ID starting with ecab714ba3047e524b6b81c7c3aae64de6ecc88477929ab4f0185d6fc9b9cbfe not found: ID does not exist" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.688741 4913 scope.go:117] "RemoveContainer" containerID="62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783" Oct 01 12:41:21 crc kubenswrapper[4913]: E1001 12:41:21.689008 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783\": container with ID starting with 62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783 not found: ID does not exist" containerID="62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783" Oct 01 12:41:21 crc kubenswrapper[4913]: I1001 12:41:21.689031 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783"} err="failed to get container status \"62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783\": rpc error: code = NotFound desc = could not find container \"62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783\": container with ID starting with 62c3131845c6dade192fb2da77b0ccf3a23b785bac470328148b18426eb6e783 not found: ID does not exist" Oct 01 12:41:22 crc kubenswrapper[4913]: I1001 12:41:22.814418 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cd5196-719c-4700-9091-2a1d43574717" path="/var/lib/kubelet/pods/e6cd5196-719c-4700-9091-2a1d43574717/volumes" Oct 01 12:41:28 crc kubenswrapper[4913]: I1001 12:41:28.836405 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qjc8c"] Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.083849 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.084341 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.084395 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.084922 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.084965 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a" gracePeriod=600 Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.734958 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a" exitCode=0 Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.735089 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a"} Oct 01 12:41:40 crc kubenswrapper[4913]: I1001 12:41:40.735561 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"96dd2b868a1064bbecfc8916fb08a36877895d66c2075b2711ca53f620f29f26"} Oct 01 12:41:53 crc kubenswrapper[4913]: I1001 12:41:53.866743 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" podUID="c349f466-f6f2-44a8-aea1-090f74dd7abe" containerName="oauth-openshift" containerID="cri-o://104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f" gracePeriod=15 Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.239094 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278583 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-786b6d57dd-59f6x"] Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278841 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278857 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278870 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278878 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278891 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fd5584-6878-4be8-83bb-f61003df2639" containerName="collect-profiles" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278899 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fd5584-6878-4be8-83bb-f61003df2639" containerName="collect-profiles" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278911 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278919 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278930 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278938 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278949 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278956 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278968 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278977 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.278988 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.278995 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279002 4913 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c349f466-f6f2-44a8-aea1-090f74dd7abe" containerName="oauth-openshift" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279009 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c349f466-f6f2-44a8-aea1-090f74dd7abe" containerName="oauth-openshift" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279022 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279029 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279041 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279049 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="extract-utilities" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279058 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279067 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279078 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc86b44-cac0-4c7d-bae3-a6d8890283dc" containerName="pruner" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279085 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc86b44-cac0-4c7d-bae3-a6d8890283dc" containerName="pruner" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279093 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279100 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="extract-content" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279110 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa48fe37-156a-4df2-b8c1-6d07961d14d3" containerName="pruner" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279116 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa48fe37-156a-4df2-b8c1-6d07961d14d3" containerName="pruner" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.279122 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279128 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279214 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cd5196-719c-4700-9091-2a1d43574717" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279223 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fd5584-6878-4be8-83bb-f61003df2639" containerName="collect-profiles" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279233 4913 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d412bdd1-98a3-4053-a4b5-c43eff851d62" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279241 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b1fcec-776e-4efd-987a-8b6d5cb4b7d7" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279249 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc86b44-cac0-4c7d-bae3-a6d8890283dc" containerName="pruner" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279255 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c349f466-f6f2-44a8-aea1-090f74dd7abe" containerName="oauth-openshift" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279263 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afa2ed5-a763-46b1-8bcd-c6775b0c7b67" containerName="registry-server" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.279290 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa48fe37-156a-4df2-b8c1-6d07961d14d3" containerName="pruner" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.282520 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.293648 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-786b6d57dd-59f6x"] Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.398899 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-session\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.398945 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-dir\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399002 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppwgf\" (UniqueName: \"kubernetes.io/projected/c349f466-f6f2-44a8-aea1-090f74dd7abe-kube-api-access-ppwgf\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399029 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-serving-cert\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399062 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-trusted-ca-bundle\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399086 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-router-certs\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399111 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-service-ca\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399136 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-login\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399387 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-error\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399478 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-provider-selection\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399532 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-ocp-branding-template\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399570 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-policies\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399601 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-cliconfig\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399634 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-idp-0-file-data\") pod \"c349f466-f6f2-44a8-aea1-090f74dd7abe\" (UID: \"c349f466-f6f2-44a8-aea1-090f74dd7abe\") " Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399840 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-service-ca" (OuterVolumeSpecName: 
"v4-0-config-system-service-ca") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399849 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399939 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-audit-policies\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399965 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckdb\" (UniqueName: \"kubernetes.io/projected/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-kube-api-access-zckdb\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399984 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.399999 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-audit-dir\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400128 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400165 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-router-certs\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400192 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400206 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400308 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-service-ca\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400340 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400356 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400412 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-login\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400491 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-session\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400535 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400560 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400568 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-error\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400644 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400660 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400675 4913 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400719 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.400732 4913 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c349f466-f6f2-44a8-aea1-090f74dd7abe-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.405329 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.405402 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c349f466-f6f2-44a8-aea1-090f74dd7abe-kube-api-access-ppwgf" (OuterVolumeSpecName: "kube-api-access-ppwgf") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "kube-api-access-ppwgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.405710 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.407194 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.407297 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.407508 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.407673 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.411866 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.412185 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c349f466-f6f2-44a8-aea1-090f74dd7abe" (UID: "c349f466-f6f2-44a8-aea1-090f74dd7abe"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.501585 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-service-ca\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502533 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502585 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502620 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-login\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502654 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-session\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502703 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502734 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-error\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502778 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: 
\"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502860 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckdb\" (UniqueName: \"kubernetes.io/projected/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-kube-api-access-zckdb\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502886 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-audit-policies\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502921 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-audit-dir\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.502918 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-service-ca\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503029 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-audit-dir\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503049 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503089 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-router-certs\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503137 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc 
kubenswrapper[4913]: I1001 12:41:54.503245 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503311 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503333 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503352 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503371 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppwgf\" (UniqueName: \"kubernetes.io/projected/c349f466-f6f2-44a8-aea1-090f74dd7abe-kube-api-access-ppwgf\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503390 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503410 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503431 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503451 4913 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c349f466-f6f2-44a8-aea1-090f74dd7abe-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.503590 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.504809 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-audit-policies\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc 
kubenswrapper[4913]: I1001 12:41:54.505378 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.505970 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.506506 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-error\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.507430 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-router-certs\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.508028 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-template-login\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.508548 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.508589 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.509114 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc 
kubenswrapper[4913]: I1001 12:41:54.509614 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-v4-0-config-system-session\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.522206 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckdb\" (UniqueName: \"kubernetes.io/projected/b0cb7d1a-4457-452d-9b4c-6b4c636e704d-kube-api-access-zckdb\") pod \"oauth-openshift-786b6d57dd-59f6x\" (UID: \"b0cb7d1a-4457-452d-9b4c-6b4c636e704d\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.610822 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.824323 4913 generic.go:334] "Generic (PLEG): container finished" podID="c349f466-f6f2-44a8-aea1-090f74dd7abe" containerID="104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f" exitCode=0 Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.824397 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" event={"ID":"c349f466-f6f2-44a8-aea1-090f74dd7abe","Type":"ContainerDied","Data":"104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f"} Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.824720 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" event={"ID":"c349f466-f6f2-44a8-aea1-090f74dd7abe","Type":"ContainerDied","Data":"d0cb582dd46430ff2da43d850435f9e5dbe11801279118e90c12c5f24c79f5c3"} Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.824760 4913 scope.go:117] "RemoveContainer" containerID="104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.824425 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qjc8c" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.849286 4913 scope.go:117] "RemoveContainer" containerID="104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f" Oct 01 12:41:54 crc kubenswrapper[4913]: E1001 12:41:54.849811 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f\": container with ID starting with 104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f not found: ID does not exist" containerID="104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.849861 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f"} err="failed to get container status \"104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f\": rpc error: code = NotFound desc = could not find container \"104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f\": container with ID starting with 104b71e954b73aef09369d456f60062cab22dd19f2429c3bc4161dac370a191f not found: ID does not exist" Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.851710 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qjc8c"] Oct 01 12:41:54 crc kubenswrapper[4913]: I1001 12:41:54.854789 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qjc8c"] Oct 01 12:41:55 crc kubenswrapper[4913]: I1001 12:41:55.017627 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-786b6d57dd-59f6x"] Oct 01 12:41:55 crc kubenswrapper[4913]: W1001 12:41:55.019236 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0cb7d1a_4457_452d_9b4c_6b4c636e704d.slice/crio-4f9209de0ef699501bda69e0df8380792091d7b1f2e68b324ae18ac4e0a63c58 WatchSource:0}: Error finding container 4f9209de0ef699501bda69e0df8380792091d7b1f2e68b324ae18ac4e0a63c58: Status 404 returned error can't find the container with id 4f9209de0ef699501bda69e0df8380792091d7b1f2e68b324ae18ac4e0a63c58 Oct 01 12:41:55 crc kubenswrapper[4913]: I1001 12:41:55.837223 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" event={"ID":"b0cb7d1a-4457-452d-9b4c-6b4c636e704d","Type":"ContainerStarted","Data":"dd822d10484516f908d8443b417da85f4a365d10a1bdde4cbb06ec78e5386fec"} Oct 01 12:41:55 crc kubenswrapper[4913]: I1001 12:41:55.837634 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:55 crc kubenswrapper[4913]: I1001 12:41:55.837654 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" event={"ID":"b0cb7d1a-4457-452d-9b4c-6b4c636e704d","Type":"ContainerStarted","Data":"4f9209de0ef699501bda69e0df8380792091d7b1f2e68b324ae18ac4e0a63c58"} Oct 01 12:41:55 crc kubenswrapper[4913]: I1001 12:41:55.844649 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" Oct 01 12:41:55 crc kubenswrapper[4913]: I1001 12:41:55.867946 4913 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-786b6d57dd-59f6x" podStartSLOduration=27.867918991 podStartE2EDuration="27.867918991s" podCreationTimestamp="2025-10-01 12:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:41:55.864326252 +0000 UTC m=+247.767801880" watchObservedRunningTime="2025-10-01 12:41:55.867918991 +0000 UTC m=+247.771394589" Oct 01 12:41:56 crc kubenswrapper[4913]: I1001 12:41:56.818799 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c349f466-f6f2-44a8-aea1-090f74dd7abe" path="/var/lib/kubelet/pods/c349f466-f6f2-44a8-aea1-090f74dd7abe/volumes" Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.792763 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrj6h"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.793314 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrj6h" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="registry-server" containerID="cri-o://a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3" gracePeriod=30 Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.797378 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4k6h5"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.797680 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4k6h5" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="registry-server" containerID="cri-o://9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c" gracePeriod=30 Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.818475 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbrpv"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.819521 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerName="marketplace-operator" containerID="cri-o://7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86" gracePeriod=30 Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.828373 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfh"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.828610 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9chfh" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="registry-server" containerID="cri-o://7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a" gracePeriod=30 Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.832727 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgpws"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.833488 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.854694 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngwdj"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.854968 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngwdj" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="registry-server" containerID="cri-o://99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091" gracePeriod=30 Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.856238 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgpws"] Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.952383 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.952466 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:06 crc kubenswrapper[4913]: I1001 12:42:06.952557 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62w4\" (UniqueName: \"kubernetes.io/projected/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-kube-api-access-j62w4\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.054129 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.054174 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.054205 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62w4\" (UniqueName: \"kubernetes.io/projected/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-kube-api-access-j62w4\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.056020 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.060020 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.072407 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62w4\" (UniqueName: \"kubernetes.io/projected/6a6a11a5-0c37-4537-9e97-4ef59ad7bc38-kube-api-access-j62w4\") pod \"marketplace-operator-79b997595-xgpws\" (UID: \"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.236356 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.242739 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.246885 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.253680 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.268574 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.345184 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.366388 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-trusted-ca\") pod \"ceea773f-549c-4d23-841c-a8e2ccb62f28\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.366422 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdq95\" (UniqueName: \"kubernetes.io/projected/89fe354d-c11c-4c4f-a2c8-309d9da44911-kube-api-access-zdq95\") pod \"89fe354d-c11c-4c4f-a2c8-309d9da44911\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.366476 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-utilities\") pod \"9a5b3988-8324-4401-b951-3b1e3fea763a\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.366510 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn6nd\" (UniqueName: \"kubernetes.io/projected/de27ba3c-2707-4ab6-827e-b9d58f8968da-kube-api-access-cn6nd\") pod \"de27ba3c-2707-4ab6-827e-b9d58f8968da\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367020 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-utilities\") pod \"de27ba3c-2707-4ab6-827e-b9d58f8968da\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367077 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrkx\" (UniqueName: \"kubernetes.io/projected/9a5b3988-8324-4401-b951-3b1e3fea763a-kube-api-access-czrkx\") pod \"9a5b3988-8324-4401-b951-3b1e3fea763a\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367094 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-catalog-content\") pod \"89fe354d-c11c-4c4f-a2c8-309d9da44911\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367109 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-catalog-content\") pod \"9a5b3988-8324-4401-b951-3b1e3fea763a\" (UID: \"9a5b3988-8324-4401-b951-3b1e3fea763a\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367130 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghvl\" (UniqueName: \"kubernetes.io/projected/ceea773f-549c-4d23-841c-a8e2ccb62f28-kube-api-access-pghvl\") pod \"ceea773f-549c-4d23-841c-a8e2ccb62f28\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367155 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-operator-metrics\") pod \"ceea773f-549c-4d23-841c-a8e2ccb62f28\" (UID: \"ceea773f-549c-4d23-841c-a8e2ccb62f28\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367181 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-utilities\") pod \"89fe354d-c11c-4c4f-a2c8-309d9da44911\" (UID: \"89fe354d-c11c-4c4f-a2c8-309d9da44911\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.367205 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-catalog-content\") pod \"de27ba3c-2707-4ab6-827e-b9d58f8968da\" (UID: \"de27ba3c-2707-4ab6-827e-b9d58f8968da\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.370386 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fe354d-c11c-4c4f-a2c8-309d9da44911-kube-api-access-zdq95" (OuterVolumeSpecName: "kube-api-access-zdq95") pod "89fe354d-c11c-4c4f-a2c8-309d9da44911" (UID: "89fe354d-c11c-4c4f-a2c8-309d9da44911"). InnerVolumeSpecName "kube-api-access-zdq95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.371180 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-utilities" (OuterVolumeSpecName: "utilities") pod "89fe354d-c11c-4c4f-a2c8-309d9da44911" (UID: "89fe354d-c11c-4c4f-a2c8-309d9da44911"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.371624 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-utilities" (OuterVolumeSpecName: "utilities") pod "de27ba3c-2707-4ab6-827e-b9d58f8968da" (UID: "de27ba3c-2707-4ab6-827e-b9d58f8968da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.371896 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ceea773f-549c-4d23-841c-a8e2ccb62f28" (UID: "ceea773f-549c-4d23-841c-a8e2ccb62f28"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.372824 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-utilities" (OuterVolumeSpecName: "utilities") pod "9a5b3988-8324-4401-b951-3b1e3fea763a" (UID: "9a5b3988-8324-4401-b951-3b1e3fea763a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.374248 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceea773f-549c-4d23-841c-a8e2ccb62f28-kube-api-access-pghvl" (OuterVolumeSpecName: "kube-api-access-pghvl") pod "ceea773f-549c-4d23-841c-a8e2ccb62f28" (UID: "ceea773f-549c-4d23-841c-a8e2ccb62f28"). InnerVolumeSpecName "kube-api-access-pghvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.374878 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5b3988-8324-4401-b951-3b1e3fea763a-kube-api-access-czrkx" (OuterVolumeSpecName: "kube-api-access-czrkx") pod "9a5b3988-8324-4401-b951-3b1e3fea763a" (UID: "9a5b3988-8324-4401-b951-3b1e3fea763a"). InnerVolumeSpecName "kube-api-access-czrkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.376884 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ceea773f-549c-4d23-841c-a8e2ccb62f28" (UID: "ceea773f-549c-4d23-841c-a8e2ccb62f28"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.386725 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de27ba3c-2707-4ab6-827e-b9d58f8968da-kube-api-access-cn6nd" (OuterVolumeSpecName: "kube-api-access-cn6nd") pod "de27ba3c-2707-4ab6-827e-b9d58f8968da" (UID: "de27ba3c-2707-4ab6-827e-b9d58f8968da"). InnerVolumeSpecName "kube-api-access-cn6nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.402440 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de27ba3c-2707-4ab6-827e-b9d58f8968da" (UID: "de27ba3c-2707-4ab6-827e-b9d58f8968da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.418986 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89fe354d-c11c-4c4f-a2c8-309d9da44911" (UID: "89fe354d-c11c-4c4f-a2c8-309d9da44911"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.422110 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a5b3988-8324-4401-b951-3b1e3fea763a" (UID: "9a5b3988-8324-4401-b951-3b1e3fea763a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468118 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb4b8\" (UniqueName: \"kubernetes.io/projected/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-kube-api-access-gb4b8\") pod \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468175 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-utilities\") pod \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468287 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-catalog-content\") pod \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\" (UID: \"2a7bb5ac-942e-4b86-ad14-e6dd271d725a\") " Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468555 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghvl\" (UniqueName: \"kubernetes.io/projected/ceea773f-549c-4d23-841c-a8e2ccb62f28-kube-api-access-pghvl\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468568 4913 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468579 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468588 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468616 4913 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ceea773f-549c-4d23-841c-a8e2ccb62f28-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468627 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdq95\" (UniqueName: \"kubernetes.io/projected/89fe354d-c11c-4c4f-a2c8-309d9da44911-kube-api-access-zdq95\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468636 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468647 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn6nd\" (UniqueName: \"kubernetes.io/projected/de27ba3c-2707-4ab6-827e-b9d58f8968da-kube-api-access-cn6nd\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468655 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/de27ba3c-2707-4ab6-827e-b9d58f8968da-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468664 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrkx\" (UniqueName: \"kubernetes.io/projected/9a5b3988-8324-4401-b951-3b1e3fea763a-kube-api-access-czrkx\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468672 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89fe354d-c11c-4c4f-a2c8-309d9da44911-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.468699 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a5b3988-8324-4401-b951-3b1e3fea763a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.469065 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-utilities" (OuterVolumeSpecName: "utilities") pod "2a7bb5ac-942e-4b86-ad14-e6dd271d725a" (UID: "2a7bb5ac-942e-4b86-ad14-e6dd271d725a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.470503 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-kube-api-access-gb4b8" (OuterVolumeSpecName: "kube-api-access-gb4b8") pod "2a7bb5ac-942e-4b86-ad14-e6dd271d725a" (UID: "2a7bb5ac-942e-4b86-ad14-e6dd271d725a"). InnerVolumeSpecName "kube-api-access-gb4b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.561491 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a7bb5ac-942e-4b86-ad14-e6dd271d725a" (UID: "2a7bb5ac-942e-4b86-ad14-e6dd271d725a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.569714 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.569752 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb4b8\" (UniqueName: \"kubernetes.io/projected/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-kube-api-access-gb4b8\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.569767 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7bb5ac-942e-4b86-ad14-e6dd271d725a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.688524 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgpws"] Oct 01 12:42:07 crc kubenswrapper[4913]: W1001 12:42:07.692602 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a6a11a5_0c37_4537_9e97_4ef59ad7bc38.slice/crio-961dcf4387671510ac62315683073e5af49059d80730c7a68e21fc978e093078 WatchSource:0}: Error finding container 961dcf4387671510ac62315683073e5af49059d80730c7a68e21fc978e093078: Status 404 returned error can't find the container with id 961dcf4387671510ac62315683073e5af49059d80730c7a68e21fc978e093078 Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.917611 4913 generic.go:334] "Generic (PLEG): container finished" podID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerID="7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a" exitCode=0 Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.917665 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9chfh" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.917742 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfh" event={"ID":"de27ba3c-2707-4ab6-827e-b9d58f8968da","Type":"ContainerDied","Data":"7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.917815 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfh" event={"ID":"de27ba3c-2707-4ab6-827e-b9d58f8968da","Type":"ContainerDied","Data":"69a28e9777b605d768f3a97d634e1b6c2dcf23ede2f2d8186282bf4908dcdb7e"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.917844 4913 scope.go:117] "RemoveContainer" containerID="7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.920030 4913 generic.go:334] "Generic (PLEG): container finished" podID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerID="7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86" exitCode=0 Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.920109 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.920214 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" event={"ID":"ceea773f-549c-4d23-841c-a8e2ccb62f28","Type":"ContainerDied","Data":"7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.920419 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbrpv" event={"ID":"ceea773f-549c-4d23-841c-a8e2ccb62f28","Type":"ContainerDied","Data":"5e266b525fb8c5a8e305bf7d615da96d3df3314f50864d77660241297e14c910"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.922245 4913 generic.go:334] "Generic (PLEG): container finished" podID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerID="9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c" exitCode=0 Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.922317 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6h5" event={"ID":"9a5b3988-8324-4401-b951-3b1e3fea763a","Type":"ContainerDied","Data":"9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.922344 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6h5" event={"ID":"9a5b3988-8324-4401-b951-3b1e3fea763a","Type":"ContainerDied","Data":"e8cb1c5e9305ccdead244dcf4e63665ca74f45b021dea1e405d277d6d0ca5a6d"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.922405 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k6h5" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.928242 4913 generic.go:334] "Generic (PLEG): container finished" podID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerID="99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091" exitCode=0 Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.928343 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngwdj" event={"ID":"2a7bb5ac-942e-4b86-ad14-e6dd271d725a","Type":"ContainerDied","Data":"99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.928356 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngwdj" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.928367 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngwdj" event={"ID":"2a7bb5ac-942e-4b86-ad14-e6dd271d725a","Type":"ContainerDied","Data":"5852d404ef1db53c5269f10d6bdfb77c90f8bb7ffa430bdca1eb2f9292a73f74"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.936646 4913 generic.go:334] "Generic (PLEG): container finished" podID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerID="a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3" exitCode=0 Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.936758 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrj6h" event={"ID":"89fe354d-c11c-4c4f-a2c8-309d9da44911","Type":"ContainerDied","Data":"a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.936841 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrj6h" event={"ID":"89fe354d-c11c-4c4f-a2c8-309d9da44911","Type":"ContainerDied","Data":"bf0f1d6972bdf3fcc71fbe42d537d0074b2fa0a9fc2cff4b24ed19e2b39d9906"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.936975 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrj6h" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.943905 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" event={"ID":"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38","Type":"ContainerStarted","Data":"5056ec6352ca66d06c5751b3032a3645b995766b6c6ca2d759d4032c74a7af4f"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.943943 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" event={"ID":"6a6a11a5-0c37-4537-9e97-4ef59ad7bc38","Type":"ContainerStarted","Data":"961dcf4387671510ac62315683073e5af49059d80730c7a68e21fc978e093078"} Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.948061 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.948219 4913 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgpws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.948261 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" podUID="6a6a11a5-0c37-4537-9e97-4ef59ad7bc38" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.967407 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" podStartSLOduration=1.967350887 podStartE2EDuration="1.967350887s" podCreationTimestamp="2025-10-01 12:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 12:42:07.962822082 +0000 UTC m=+259.866297670" watchObservedRunningTime="2025-10-01 12:42:07.967350887 +0000 UTC m=+259.870826475" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.970610 4913 scope.go:117] "RemoveContainer" containerID="a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.992804 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngwdj"] Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.997822 4913 scope.go:117] "RemoveContainer" containerID="508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37" Oct 01 12:42:07 crc kubenswrapper[4913]: I1001 12:42:07.999593 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngwdj"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.024794 4913 scope.go:117] "RemoveContainer" containerID="7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.026136 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a\": container with ID starting with 7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a not found: ID does not exist" containerID="7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.026191 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a"} err="failed to get container status \"7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a\": rpc error: code = NotFound desc = could not find container \"7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a\": container with ID starting with 7b2a098cf1ccaf3a51083a01be58f77eb0606e21a5c2bc10772307f67912e20a not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.026217 4913 scope.go:117] "RemoveContainer" containerID="a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.026492 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbrpv"] Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.026769 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97\": container with ID starting with a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97 not found: ID does not exist" containerID="a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.026874 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97"} err="failed to get container status \"a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97\": rpc error: code = NotFound desc = could not find container \"a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97\": container with ID starting with a1bbeb3f808c49b76eb8d458723f010c495132e001f6fd9b84cfd33dfaeb8f97 not found: ID does not exist" Oct 01 12:42:08 crc 
kubenswrapper[4913]: I1001 12:42:08.026910 4913 scope.go:117] "RemoveContainer" containerID="508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.027300 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37\": container with ID starting with 508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37 not found: ID does not exist" containerID="508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.027463 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37"} err="failed to get container status \"508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37\": rpc error: code = NotFound desc = could not find container \"508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37\": container with ID starting with 508d13c6118b8bf081b6b8849a163de62d8e88c331d8d63763aaa038fddf7b37 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.027494 4913 scope.go:117] "RemoveContainer" containerID="7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.036222 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbrpv"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.040116 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4k6h5"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.043456 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4k6h5"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.048379 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrj6h"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.049611 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrj6h"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.052304 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfh"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.052485 4913 scope.go:117] "RemoveContainer" containerID="7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.052933 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86\": container with ID starting with 7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86 not found: ID does not exist" containerID="7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.052969 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86"} err="failed to get container status \"7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86\": rpc error: code = NotFound desc = could not find container \"7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86\": 
container with ID starting with 7c88b94b7679abe8863128d1e4ffedd7d0c79648ad02c0fee0c5a243316e4b86 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.053005 4913 scope.go:117] "RemoveContainer" containerID="9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.055131 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfh"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.069074 4913 scope.go:117] "RemoveContainer" containerID="0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.086772 4913 scope.go:117] "RemoveContainer" containerID="56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.101275 4913 scope.go:117] "RemoveContainer" containerID="9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.102612 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c\": container with ID starting with 9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c not found: ID does not exist" containerID="9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.102640 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c"} err="failed to get container status \"9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c\": rpc error: code = NotFound desc = could not find container \"9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c\": container with ID starting with 9d88b9cedd658546c8a390a3d224c107b790388b961e27026f428156c4d4fd9c not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.102667 4913 scope.go:117] "RemoveContainer" containerID="0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.103043 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f\": container with ID starting with 0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f not found: ID does not exist" containerID="0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.103119 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f"} err="failed to get container status \"0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f\": rpc error: code = NotFound desc = could not find container \"0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f\": container with ID starting with 0b47094481fdeedd251ea2178fea3907ee0401261fe356c8e890888e0c37506f not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.103141 4913 scope.go:117] "RemoveContainer" containerID="56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.103522 
4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831\": container with ID starting with 56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831 not found: ID does not exist" containerID="56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.103640 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831"} err="failed to get container status \"56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831\": rpc error: code = NotFound desc = could not find container \"56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831\": container with ID starting with 56372340530d13dafd752b211a37f1ef29fd0b1ac228012f2ebff18a381eb831 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.103752 4913 scope.go:117] "RemoveContainer" containerID="99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.120997 4913 scope.go:117] "RemoveContainer" containerID="7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.136426 4913 scope.go:117] "RemoveContainer" containerID="ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.153837 4913 scope.go:117] "RemoveContainer" containerID="99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.154584 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091\": container with ID starting with 99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091 not found: ID does not exist" containerID="99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.154625 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091"} err="failed to get container status \"99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091\": rpc error: code = NotFound desc = could not find container \"99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091\": container with ID starting with 99b33aaca64990db7675d0bdfb78859f629dd14144e637cfaca966e1dbb98091 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.154654 4913 scope.go:117] "RemoveContainer" containerID="7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.155791 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca\": container with ID starting with 7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca not found: ID does not exist" containerID="7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.155830 4913 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca"} err="failed to get container status \"7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca\": rpc error: code = NotFound desc = could not find container \"7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca\": container with ID starting with 7964bf01594497ecd6bbece1f4066551345e08e83f4dc9746f9bbe01c69394ca not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.155864 4913 scope.go:117] "RemoveContainer" containerID="ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.156361 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07\": container with ID starting with ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07 not found: ID does not exist" containerID="ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.156415 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07"} err="failed to get container status \"ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07\": rpc error: code = NotFound desc = could not find container \"ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07\": container with ID starting with ca413b07e5d6237f3c4e815f51587abadeb263ec2e5196082fff58c46bc44d07 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.156451 4913 scope.go:117] "RemoveContainer" containerID="a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.168674 4913 scope.go:117] "RemoveContainer" containerID="5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.189139 4913 scope.go:117] "RemoveContainer" containerID="e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.203299 4913 scope.go:117] "RemoveContainer" containerID="a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.203856 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3\": container with ID starting with a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3 not found: ID does not exist" containerID="a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.203897 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3"} err="failed to get container status \"a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3\": rpc error: code = NotFound desc = could not find container \"a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3\": container with ID starting with a0e4d9a12c449a729d7c2a65d470883b80655598903d0e12f154aac1a0e46af3 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.203927 4913 
scope.go:117] "RemoveContainer" containerID="5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.204216 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12\": container with ID starting with 5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12 not found: ID does not exist" containerID="5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.204261 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12"} err="failed to get container status \"5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12\": rpc error: code = NotFound desc = could not find container \"5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12\": container with ID starting with 5314d4b822c9077f3b8a706a7e14386a1d560320b4403d400b30d3b563be1e12 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.204308 4913 scope.go:117] "RemoveContainer" containerID="e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.204696 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031\": container with ID starting with e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031 not found: ID does not exist" containerID="e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.204736 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031"} err="failed to get container status \"e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031\": rpc error: code = NotFound desc = could not find container \"e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031\": container with ID starting with e3efbcebcb8308b748183c5fefa54bb2b3dc25056260e61afb3775c28ee04031 not found: ID does not exist" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595109 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mrpb7"] Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595352 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595363 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595376 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595382 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595389 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" 
containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595395 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595402 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595408 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595416 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595421 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595430 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595436 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595443 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595448 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595456 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595462 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595470 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595475 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595484 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595489 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="extract-utilities" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595496 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595502 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="extract-content" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595512 4913 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595518 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: E1001 12:42:08.595526 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerName="marketplace-operator" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595532 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerName="marketplace-operator" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595610 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595620 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595629 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595639 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" containerName="registry-server" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.595645 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" containerName="marketplace-operator" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.596335 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.599371 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.602710 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mrpb7"] Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.791210 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-utilities\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.791439 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-catalog-content\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.791482 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4cp\" (UniqueName: \"kubernetes.io/projected/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-kube-api-access-sx4cp\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.812977 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7bb5ac-942e-4b86-ad14-e6dd271d725a" path="/var/lib/kubelet/pods/2a7bb5ac-942e-4b86-ad14-e6dd271d725a/volumes" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.813793 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89fe354d-c11c-4c4f-a2c8-309d9da44911" path="/var/lib/kubelet/pods/89fe354d-c11c-4c4f-a2c8-309d9da44911/volumes" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.814640 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5b3988-8324-4401-b951-3b1e3fea763a" path="/var/lib/kubelet/pods/9a5b3988-8324-4401-b951-3b1e3fea763a/volumes" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.815942 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceea773f-549c-4d23-841c-a8e2ccb62f28" path="/var/lib/kubelet/pods/ceea773f-549c-4d23-841c-a8e2ccb62f28/volumes" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.816494 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de27ba3c-2707-4ab6-827e-b9d58f8968da" path="/var/lib/kubelet/pods/de27ba3c-2707-4ab6-827e-b9d58f8968da/volumes" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.892742 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-catalog-content\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.892776 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4cp\" (UniqueName: 
\"kubernetes.io/projected/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-kube-api-access-sx4cp\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.892825 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-utilities\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.893198 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-catalog-content\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.893235 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-utilities\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.915548 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4cp\" (UniqueName: \"kubernetes.io/projected/5f4d496a-b3d5-49d0-88bb-aa061f342fd3-kube-api-access-sx4cp\") pod \"certified-operators-mrpb7\" (UID: \"5f4d496a-b3d5-49d0-88bb-aa061f342fd3\") " pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:08 crc kubenswrapper[4913]: I1001 12:42:08.971593 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xgpws" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.232946 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.597296 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mxbn7"] Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.598418 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.600789 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.610059 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxbn7"] Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.633015 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mrpb7"] Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.738346 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gkl\" (UniqueName: \"kubernetes.io/projected/ed62703c-99a3-4c2f-8b04-286e67063932-kube-api-access-m4gkl\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.738541 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed62703c-99a3-4c2f-8b04-286e67063932-utilities\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.738683 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed62703c-99a3-4c2f-8b04-286e67063932-catalog-content\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.839443 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed62703c-99a3-4c2f-8b04-286e67063932-catalog-content\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.839514 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gkl\" (UniqueName: \"kubernetes.io/projected/ed62703c-99a3-4c2f-8b04-286e67063932-kube-api-access-m4gkl\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.839565 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed62703c-99a3-4c2f-8b04-286e67063932-utilities\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.839961 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed62703c-99a3-4c2f-8b04-286e67063932-catalog-content\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.839987 4913 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed62703c-99a3-4c2f-8b04-286e67063932-utilities\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.859776 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gkl\" (UniqueName: \"kubernetes.io/projected/ed62703c-99a3-4c2f-8b04-286e67063932-kube-api-access-m4gkl\") pod \"redhat-marketplace-mxbn7\" (UID: \"ed62703c-99a3-4c2f-8b04-286e67063932\") " pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.935990 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.975829 4913 generic.go:334] "Generic (PLEG): container finished" podID="5f4d496a-b3d5-49d0-88bb-aa061f342fd3" containerID="13de7951e70fd4d025d73327b725b001d19d7d2cedb29141f3647bf08af1ff37" exitCode=0 Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.977359 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrpb7" event={"ID":"5f4d496a-b3d5-49d0-88bb-aa061f342fd3","Type":"ContainerDied","Data":"13de7951e70fd4d025d73327b725b001d19d7d2cedb29141f3647bf08af1ff37"} Oct 01 12:42:09 crc kubenswrapper[4913]: I1001 12:42:09.977412 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrpb7" event={"ID":"5f4d496a-b3d5-49d0-88bb-aa061f342fd3","Type":"ContainerStarted","Data":"323bfe6ff43ac3cbf89d14d8f805c0679808e2625e5f2c39ec30e373dcfa1a2c"} Oct 01 12:42:10 crc kubenswrapper[4913]: I1001 12:42:10.338353 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxbn7"] Oct 01 12:42:10 crc kubenswrapper[4913]: I1001 12:42:10.982586 4913 generic.go:334] "Generic (PLEG): container finished" podID="ed62703c-99a3-4c2f-8b04-286e67063932" containerID="b962aa6c00493b839df0fb3d4293ca5de6a89d2ce24105ed714feb0c112a7604" exitCode=0 Oct 01 12:42:10 crc kubenswrapper[4913]: I1001 12:42:10.982663 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxbn7" event={"ID":"ed62703c-99a3-4c2f-8b04-286e67063932","Type":"ContainerDied","Data":"b962aa6c00493b839df0fb3d4293ca5de6a89d2ce24105ed714feb0c112a7604"} Oct 01 12:42:10 crc kubenswrapper[4913]: I1001 12:42:10.982987 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxbn7" event={"ID":"ed62703c-99a3-4c2f-8b04-286e67063932","Type":"ContainerStarted","Data":"6efda259958aec111d860034b40e183e8c53eec946fd2ac479e25a3c656f8fee"} Oct 01 12:42:10 crc kubenswrapper[4913]: I1001 12:42:10.984905 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrpb7" event={"ID":"5f4d496a-b3d5-49d0-88bb-aa061f342fd3","Type":"ContainerStarted","Data":"3d59bd2754963cda7abb37892b8cc6406b9915509a9e3636fba5df7697b1b57f"} Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.002037 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rmfsv"] Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.003348 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.006015 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.012503 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rmfsv"] Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.160130 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzj26\" (UniqueName: \"kubernetes.io/projected/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-kube-api-access-tzj26\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.161168 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-catalog-content\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.161288 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-utilities\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.262281 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-catalog-content\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.262598 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-utilities\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.262696 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzj26\" (UniqueName: \"kubernetes.io/projected/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-kube-api-access-tzj26\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.262737 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-catalog-content\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.263067 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-utilities\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " 
pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.280200 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzj26\" (UniqueName: \"kubernetes.io/projected/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-kube-api-access-tzj26\") pod \"redhat-operators-rmfsv\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.319752 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.726825 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rmfsv"] Oct 01 12:42:11 crc kubenswrapper[4913]: W1001 12:42:11.732402 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0135ef10_a28c_42de_bc9c_bdc0cd20e8e1.slice/crio-1e3695d895137762bb9f4aa110bdd78e1ee6f7e14432af73fc297b0a61a6e03b WatchSource:0}: Error finding container 1e3695d895137762bb9f4aa110bdd78e1ee6f7e14432af73fc297b0a61a6e03b: Status 404 returned error can't find the container with id 1e3695d895137762bb9f4aa110bdd78e1ee6f7e14432af73fc297b0a61a6e03b Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.997623 4913 generic.go:334] "Generic (PLEG): container finished" podID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerID="a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e" exitCode=0 Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.998252 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmfsv" event={"ID":"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1","Type":"ContainerDied","Data":"a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e"} Oct 01 12:42:11 crc kubenswrapper[4913]: I1001 12:42:11.998298 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmfsv" event={"ID":"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1","Type":"ContainerStarted","Data":"1e3695d895137762bb9f4aa110bdd78e1ee6f7e14432af73fc297b0a61a6e03b"} Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.000144 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dl8lx"] Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.005155 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.007232 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.011010 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dl8lx"] Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.011553 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxbn7" event={"ID":"ed62703c-99a3-4c2f-8b04-286e67063932","Type":"ContainerStarted","Data":"6409e743a2538fd6b08172f5d20c048eb9860140ff0e0f7102720ada47287a91"} Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.017076 4913 generic.go:334] "Generic (PLEG): container finished" podID="5f4d496a-b3d5-49d0-88bb-aa061f342fd3" containerID="3d59bd2754963cda7abb37892b8cc6406b9915509a9e3636fba5df7697b1b57f" exitCode=0 Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.017475 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrpb7" event={"ID":"5f4d496a-b3d5-49d0-88bb-aa061f342fd3","Type":"ContainerDied","Data":"3d59bd2754963cda7abb37892b8cc6406b9915509a9e3636fba5df7697b1b57f"} Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.172481 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed53f181-8b36-4a70-a904-871780dda5cf-catalog-content\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.172558 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwdq\" (UniqueName: \"kubernetes.io/projected/ed53f181-8b36-4a70-a904-871780dda5cf-kube-api-access-svwdq\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.172629 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed53f181-8b36-4a70-a904-871780dda5cf-utilities\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.273790 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed53f181-8b36-4a70-a904-871780dda5cf-utilities\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.273834 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed53f181-8b36-4a70-a904-871780dda5cf-catalog-content\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.273868 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svwdq\" 
(UniqueName: \"kubernetes.io/projected/ed53f181-8b36-4a70-a904-871780dda5cf-kube-api-access-svwdq\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.274323 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed53f181-8b36-4a70-a904-871780dda5cf-utilities\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.274389 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed53f181-8b36-4a70-a904-871780dda5cf-catalog-content\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.293505 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwdq\" (UniqueName: \"kubernetes.io/projected/ed53f181-8b36-4a70-a904-871780dda5cf-kube-api-access-svwdq\") pod \"community-operators-dl8lx\" (UID: \"ed53f181-8b36-4a70-a904-871780dda5cf\") " pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.362516 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:12 crc kubenswrapper[4913]: I1001 12:42:12.779077 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dl8lx"] Oct 01 12:42:12 crc kubenswrapper[4913]: W1001 12:42:12.790026 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded53f181_8b36_4a70_a904_871780dda5cf.slice/crio-b1d1f98a65231a643a3b6b98543c8785ca26f04ee3537334a849673036472901 WatchSource:0}: Error finding container b1d1f98a65231a643a3b6b98543c8785ca26f04ee3537334a849673036472901: Status 404 returned error can't find the container with id b1d1f98a65231a643a3b6b98543c8785ca26f04ee3537334a849673036472901 Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.023488 4913 generic.go:334] "Generic (PLEG): container finished" podID="ed53f181-8b36-4a70-a904-871780dda5cf" containerID="1e79a1c4c6627b600c45819adddae3fc9f18ea9f63753e481689a55d6aec9d52" exitCode=0 Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.023654 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl8lx" event={"ID":"ed53f181-8b36-4a70-a904-871780dda5cf","Type":"ContainerDied","Data":"1e79a1c4c6627b600c45819adddae3fc9f18ea9f63753e481689a55d6aec9d52"} Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.023866 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl8lx" event={"ID":"ed53f181-8b36-4a70-a904-871780dda5cf","Type":"ContainerStarted","Data":"b1d1f98a65231a643a3b6b98543c8785ca26f04ee3537334a849673036472901"} Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.027130 4913 generic.go:334] "Generic (PLEG): container finished" podID="ed62703c-99a3-4c2f-8b04-286e67063932" containerID="6409e743a2538fd6b08172f5d20c048eb9860140ff0e0f7102720ada47287a91" exitCode=0 Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.027196 4913 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxbn7" event={"ID":"ed62703c-99a3-4c2f-8b04-286e67063932","Type":"ContainerDied","Data":"6409e743a2538fd6b08172f5d20c048eb9860140ff0e0f7102720ada47287a91"} Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.027223 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxbn7" event={"ID":"ed62703c-99a3-4c2f-8b04-286e67063932","Type":"ContainerStarted","Data":"2d8140866fd29866c1b1df98b6a72e4cabde371d999dddebcc04d752e9b9011f"} Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.030345 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrpb7" event={"ID":"5f4d496a-b3d5-49d0-88bb-aa061f342fd3","Type":"ContainerStarted","Data":"f655bbabecad64ecb3fed4729037daebc4b883ce965e7bd32d017e41c5e5524b"} Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.067899 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mxbn7" podStartSLOduration=2.582947097 podStartE2EDuration="4.067879182s" podCreationTimestamp="2025-10-01 12:42:09 +0000 UTC" firstStartedPulling="2025-10-01 12:42:10.985018905 +0000 UTC m=+262.888494483" lastFinishedPulling="2025-10-01 12:42:12.46995099 +0000 UTC m=+264.373426568" observedRunningTime="2025-10-01 12:42:13.067013968 +0000 UTC m=+264.970489596" watchObservedRunningTime="2025-10-01 12:42:13.067879182 +0000 UTC m=+264.971354750" Oct 01 12:42:13 crc kubenswrapper[4913]: I1001 12:42:13.086390 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mrpb7" podStartSLOduration=2.363518435 podStartE2EDuration="5.086369736s" podCreationTimestamp="2025-10-01 12:42:08 +0000 UTC" firstStartedPulling="2025-10-01 12:42:09.978211052 +0000 UTC m=+261.881686630" lastFinishedPulling="2025-10-01 12:42:12.701062353 +0000 UTC m=+264.604537931" observedRunningTime="2025-10-01 12:42:13.081147931 +0000 UTC m=+264.984623539" watchObservedRunningTime="2025-10-01 12:42:13.086369736 +0000 UTC m=+264.989845334" Oct 01 12:42:14 crc kubenswrapper[4913]: I1001 12:42:14.036585 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl8lx" event={"ID":"ed53f181-8b36-4a70-a904-871780dda5cf","Type":"ContainerStarted","Data":"d3a1959fcc8f1278360cdd34f87517da54551b87286d91c85fe1fc98a83ae6f9"} Oct 01 12:42:14 crc kubenswrapper[4913]: I1001 12:42:14.038293 4913 generic.go:334] "Generic (PLEG): container finished" podID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerID="b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d" exitCode=0 Oct 01 12:42:14 crc kubenswrapper[4913]: I1001 12:42:14.038326 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmfsv" event={"ID":"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1","Type":"ContainerDied","Data":"b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d"} Oct 01 12:42:15 crc kubenswrapper[4913]: I1001 12:42:15.045204 4913 generic.go:334] "Generic (PLEG): container finished" podID="ed53f181-8b36-4a70-a904-871780dda5cf" containerID="d3a1959fcc8f1278360cdd34f87517da54551b87286d91c85fe1fc98a83ae6f9" exitCode=0 Oct 01 12:42:15 crc kubenswrapper[4913]: I1001 12:42:15.045260 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl8lx" 
event={"ID":"ed53f181-8b36-4a70-a904-871780dda5cf","Type":"ContainerDied","Data":"d3a1959fcc8f1278360cdd34f87517da54551b87286d91c85fe1fc98a83ae6f9"} Oct 01 12:42:16 crc kubenswrapper[4913]: I1001 12:42:16.052487 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmfsv" event={"ID":"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1","Type":"ContainerStarted","Data":"49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0"} Oct 01 12:42:16 crc kubenswrapper[4913]: I1001 12:42:16.055832 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl8lx" event={"ID":"ed53f181-8b36-4a70-a904-871780dda5cf","Type":"ContainerStarted","Data":"34836e1cd1171c92def4af5086cee77228df807f5ca085e6c5c6da85ba52de13"} Oct 01 12:42:16 crc kubenswrapper[4913]: I1001 12:42:16.070561 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rmfsv" podStartSLOduration=3.490657924 podStartE2EDuration="6.070547293s" podCreationTimestamp="2025-10-01 12:42:10 +0000 UTC" firstStartedPulling="2025-10-01 12:42:12.003054165 +0000 UTC m=+263.906529743" lastFinishedPulling="2025-10-01 12:42:14.582943544 +0000 UTC m=+266.486419112" observedRunningTime="2025-10-01 12:42:16.070009948 +0000 UTC m=+267.973485536" watchObservedRunningTime="2025-10-01 12:42:16.070547293 +0000 UTC m=+267.974022871" Oct 01 12:42:16 crc kubenswrapper[4913]: I1001 12:42:16.093409 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dl8lx" podStartSLOduration=2.579419287 podStartE2EDuration="5.093392987s" podCreationTimestamp="2025-10-01 12:42:11 +0000 UTC" firstStartedPulling="2025-10-01 12:42:13.024804987 +0000 UTC m=+264.928280565" lastFinishedPulling="2025-10-01 12:42:15.538778687 +0000 UTC m=+267.442254265" observedRunningTime="2025-10-01 12:42:16.090137557 +0000 UTC m=+267.993613145" watchObservedRunningTime="2025-10-01 12:42:16.093392987 +0000 UTC m=+267.996868565" Oct 01 12:42:19 crc kubenswrapper[4913]: I1001 12:42:19.234562 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:19 crc kubenswrapper[4913]: I1001 12:42:19.235158 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:19 crc kubenswrapper[4913]: I1001 12:42:19.279570 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:19 crc kubenswrapper[4913]: I1001 12:42:19.936743 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:19 crc kubenswrapper[4913]: I1001 12:42:19.936801 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:19 crc kubenswrapper[4913]: I1001 12:42:19.973571 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:20 crc kubenswrapper[4913]: I1001 12:42:20.112226 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mrpb7" Oct 01 12:42:20 crc kubenswrapper[4913]: I1001 12:42:20.114764 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-mxbn7" Oct 01 12:42:21 crc kubenswrapper[4913]: I1001 12:42:21.321501 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:21 crc kubenswrapper[4913]: I1001 12:42:21.321815 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:21 crc kubenswrapper[4913]: I1001 12:42:21.361294 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:22 crc kubenswrapper[4913]: I1001 12:42:22.129604 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 12:42:22 crc kubenswrapper[4913]: I1001 12:42:22.363635 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:22 crc kubenswrapper[4913]: I1001 12:42:22.363706 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:22 crc kubenswrapper[4913]: I1001 12:42:22.405615 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:42:23 crc kubenswrapper[4913]: I1001 12:42:23.126447 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dl8lx" Oct 01 12:43:40 crc kubenswrapper[4913]: I1001 12:43:40.083688 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:43:40 crc kubenswrapper[4913]: I1001 12:43:40.084329 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:44:10 crc kubenswrapper[4913]: I1001 12:44:10.084245 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:44:10 crc kubenswrapper[4913]: I1001 12:44:10.084903 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.084149 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.084632 4913 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.084672 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.085296 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96dd2b868a1064bbecfc8916fb08a36877895d66c2075b2711ca53f620f29f26"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.085348 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://96dd2b868a1064bbecfc8916fb08a36877895d66c2075b2711ca53f620f29f26" gracePeriod=600 Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.912254 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="96dd2b868a1064bbecfc8916fb08a36877895d66c2075b2711ca53f620f29f26" exitCode=0 Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.912345 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"96dd2b868a1064bbecfc8916fb08a36877895d66c2075b2711ca53f620f29f26"} Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.912737 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"1cc68559427fcac3c481bc75066fa47eb6ec40478fc203e9f25d7c355d20a2fd"} Oct 01 12:44:40 crc kubenswrapper[4913]: I1001 12:44:40.912789 4913 scope.go:117] "RemoveContainer" containerID="d9021cfd8749472265589aa1b741073f546d4ac9557b29c76134f47d4c61e91a" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.135294 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r"] Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.136556 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.138406 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.140501 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.144452 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r"] Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.304089 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-secret-volume\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.304608 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-config-volume\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.304674 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jznrc\" (UniqueName: \"kubernetes.io/projected/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-kube-api-access-jznrc\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.406212 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jznrc\" (UniqueName: \"kubernetes.io/projected/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-kube-api-access-jznrc\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.406389 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-secret-volume\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.406491 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-config-volume\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.407301 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-config-volume\") pod 
\"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.419417 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-secret-volume\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.421866 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jznrc\" (UniqueName: \"kubernetes.io/projected/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-kube-api-access-jznrc\") pod \"collect-profiles-29322045-gbt8r\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.458695 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:00 crc kubenswrapper[4913]: I1001 12:45:00.637606 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r"] Oct 01 12:45:01 crc kubenswrapper[4913]: I1001 12:45:01.038176 4913 generic.go:334] "Generic (PLEG): container finished" podID="3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" containerID="280737ff584f454f5e74c72fe99cc660a6076c1416c8bfe3c96f4cd9fa66789b" exitCode=0 Oct 01 12:45:01 crc kubenswrapper[4913]: I1001 12:45:01.038221 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" event={"ID":"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e","Type":"ContainerDied","Data":"280737ff584f454f5e74c72fe99cc660a6076c1416c8bfe3c96f4cd9fa66789b"} Oct 01 12:45:01 crc kubenswrapper[4913]: I1001 12:45:01.038248 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" event={"ID":"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e","Type":"ContainerStarted","Data":"f27f1c4ab51ce58615d964a06d99d3c9fd6ce932ad43f3bef999f513bb900e0d"} Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.283242 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.434308 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jznrc\" (UniqueName: \"kubernetes.io/projected/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-kube-api-access-jznrc\") pod \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.434405 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-secret-volume\") pod \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.434464 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-config-volume\") pod \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\" (UID: \"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e\") " Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.435398 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-config-volume" (OuterVolumeSpecName: "config-volume") pod "3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" (UID: "3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.442500 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" (UID: "3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.442532 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-kube-api-access-jznrc" (OuterVolumeSpecName: "kube-api-access-jznrc") pod "3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" (UID: "3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e"). InnerVolumeSpecName "kube-api-access-jznrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.535798 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.535831 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jznrc\" (UniqueName: \"kubernetes.io/projected/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-kube-api-access-jznrc\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:02 crc kubenswrapper[4913]: I1001 12:45:02.535876 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:03 crc kubenswrapper[4913]: I1001 12:45:03.051195 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" event={"ID":"3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e","Type":"ContainerDied","Data":"f27f1c4ab51ce58615d964a06d99d3c9fd6ce932ad43f3bef999f513bb900e0d"} Oct 01 12:45:03 crc kubenswrapper[4913]: I1001 12:45:03.051243 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r" Oct 01 12:45:03 crc kubenswrapper[4913]: I1001 12:45:03.051249 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27f1c4ab51ce58615d964a06d99d3c9fd6ce932ad43f3bef999f513bb900e0d" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.625150 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l49s7"] Oct 01 12:45:58 crc kubenswrapper[4913]: E1001 12:45:58.626213 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" containerName="collect-profiles" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.626230 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" containerName="collect-profiles" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.626422 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" containerName="collect-profiles" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.626925 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.647029 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l49s7"] Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798087 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6djf\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-kube-api-access-z6djf\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798132 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-bound-sa-token\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798151 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-trusted-ca\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798172 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798198 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798221 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-registry-certificates\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798239 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-registry-tls\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.798320 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.824720 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.898946 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-registry-certificates\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.898994 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-registry-tls\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.899032 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.899071 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6djf\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-kube-api-access-z6djf\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.899086 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-bound-sa-token\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.899100 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-trusted-ca\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.899306 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.899739 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.900168 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-registry-certificates\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.900924 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-trusted-ca\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.907972 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-registry-tls\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.908336 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.921483 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-bound-sa-token\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.922819 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6djf\" (UniqueName: \"kubernetes.io/projected/e9537bbb-9cd5-46d5-8cf7-6533c378eb0a-kube-api-access-z6djf\") pod \"image-registry-66df7c8f76-l49s7\" (UID: \"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a\") " pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:58 crc kubenswrapper[4913]: I1001 12:45:58.950944 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:59 crc kubenswrapper[4913]: I1001 12:45:59.119964 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l49s7"] Oct 01 12:45:59 crc kubenswrapper[4913]: I1001 12:45:59.405916 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" event={"ID":"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a","Type":"ContainerStarted","Data":"2a2b990886f31ceab58851a546655c06df7f9ca6c999bb4211179f54f2fbd0d6"} Oct 01 12:45:59 crc kubenswrapper[4913]: I1001 12:45:59.406337 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:45:59 crc kubenswrapper[4913]: I1001 12:45:59.406356 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" event={"ID":"e9537bbb-9cd5-46d5-8cf7-6533c378eb0a","Type":"ContainerStarted","Data":"81496a3e869306eb9bb490e31d61c3170d9819452e55cb62d519754d8066b5fa"} Oct 01 12:45:59 crc kubenswrapper[4913]: I1001 12:45:59.424795 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" podStartSLOduration=1.4247717930000001 podStartE2EDuration="1.424771793s" podCreationTimestamp="2025-10-01 12:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:45:59.421657855 +0000 UTC m=+491.325133533" watchObservedRunningTime="2025-10-01 12:45:59.424771793 +0000 UTC m=+491.328247401" Oct 01 12:46:18 crc kubenswrapper[4913]: I1001 12:46:18.955763 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l49s7" Oct 01 12:46:19 crc kubenswrapper[4913]: I1001 12:46:19.005403 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lhq4g"] Oct 01 12:46:40 crc kubenswrapper[4913]: I1001 12:46:40.084262 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:46:40 crc kubenswrapper[4913]: I1001 12:46:40.084926 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.060786 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" podUID="bd1e4468-40ba-4a47-8f89-99de7fec4071" containerName="registry" containerID="cri-o://5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94" gracePeriod=30 Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.418216 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428813 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428855 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd1e4468-40ba-4a47-8f89-99de7fec4071-ca-trust-extracted\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428877 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-trusted-ca\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428901 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddzm\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-kube-api-access-xddzm\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428917 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-tls\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428940 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd1e4468-40ba-4a47-8f89-99de7fec4071-installation-pull-secrets\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428968 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-certificates\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.428997 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-bound-sa-token\") pod \"bd1e4468-40ba-4a47-8f89-99de7fec4071\" (UID: \"bd1e4468-40ba-4a47-8f89-99de7fec4071\") " Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.429744 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.430020 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.434676 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.435418 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-kube-api-access-xddzm" (OuterVolumeSpecName: "kube-api-access-xddzm") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "kube-api-access-xddzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.435760 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1e4468-40ba-4a47-8f89-99de7fec4071-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.442249 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.449187 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.454239 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1e4468-40ba-4a47-8f89-99de7fec4071-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bd1e4468-40ba-4a47-8f89-99de7fec4071" (UID: "bd1e4468-40ba-4a47-8f89-99de7fec4071"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.533812 4913 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.534656 4913 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.534734 4913 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd1e4468-40ba-4a47-8f89-99de7fec4071-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.534784 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd1e4468-40ba-4a47-8f89-99de7fec4071-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.534819 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xddzm\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-kube-api-access-xddzm\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.534840 4913 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd1e4468-40ba-4a47-8f89-99de7fec4071-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.534863 4913 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd1e4468-40ba-4a47-8f89-99de7fec4071-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.691335 4913 generic.go:334] "Generic (PLEG): container finished" podID="bd1e4468-40ba-4a47-8f89-99de7fec4071" containerID="5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94" exitCode=0 Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.691395 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" event={"ID":"bd1e4468-40ba-4a47-8f89-99de7fec4071","Type":"ContainerDied","Data":"5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94"} Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.691408 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.691421 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lhq4g" event={"ID":"bd1e4468-40ba-4a47-8f89-99de7fec4071","Type":"ContainerDied","Data":"2ef51dbd023380cfc77eaf2705a0e6d63ffc7134343f7aa085bc15867fa215a6"} Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.691437 4913 scope.go:117] "RemoveContainer" containerID="5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.705694 4913 scope.go:117] "RemoveContainer" containerID="5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94" Oct 01 12:46:44 crc kubenswrapper[4913]: E1001 12:46:44.706120 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94\": container with ID starting with 5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94 not found: ID does not exist" containerID="5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.706453 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94"} err="failed to get container status \"5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94\": rpc error: code = NotFound desc = could not find container \"5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94\": container with ID starting with 5244ed9b71f94359f50613196492f3ae22660bab48223204988695cff70b5f94 not found: ID does not exist" Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.721818 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lhq4g"] Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.723852 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lhq4g"] Oct 01 12:46:44 crc kubenswrapper[4913]: I1001 12:46:44.813553 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1e4468-40ba-4a47-8f89-99de7fec4071" path="/var/lib/kubelet/pods/bd1e4468-40ba-4a47-8f89-99de7fec4071/volumes" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.453452 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-m2cpf"] Oct 01 12:47:04 crc kubenswrapper[4913]: E1001 12:47:04.454150 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1e4468-40ba-4a47-8f89-99de7fec4071" containerName="registry" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.454164 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1e4468-40ba-4a47-8f89-99de7fec4071" containerName="registry" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.454252 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1e4468-40ba-4a47-8f89-99de7fec4071" containerName="registry" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.454649 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.456884 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.457017 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.457048 4913 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vjjj5" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.460367 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-m2cpf"] Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.468303 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-n8tcp"] Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.468934 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-n8tcp" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.470600 4913 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-phbfm" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.483198 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2z957"] Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.483793 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.486090 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-n8tcp"] Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.487294 4913 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2k25l" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.503553 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2z957"] Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.590539 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lgg\" (UniqueName: \"kubernetes.io/projected/30c02957-87a8-4ab3-bbcf-9248f4c9ffc6-kube-api-access-c5lgg\") pod \"cert-manager-cainjector-7f985d654d-m2cpf\" (UID: \"30c02957-87a8-4ab3-bbcf-9248f4c9ffc6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.590587 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctxl\" (UniqueName: \"kubernetes.io/projected/47e455bc-ca7f-42fc-a85d-720561425b25-kube-api-access-mctxl\") pod \"cert-manager-webhook-5655c58dd6-2z957\" (UID: \"47e455bc-ca7f-42fc-a85d-720561425b25\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.590713 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmqr\" (UniqueName: \"kubernetes.io/projected/f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7-kube-api-access-hjmqr\") pod \"cert-manager-5b446d88c5-n8tcp\" (UID: \"f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7\") " pod="cert-manager/cert-manager-5b446d88c5-n8tcp" Oct 01 12:47:04 
crc kubenswrapper[4913]: I1001 12:47:04.692800 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmqr\" (UniqueName: \"kubernetes.io/projected/f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7-kube-api-access-hjmqr\") pod \"cert-manager-5b446d88c5-n8tcp\" (UID: \"f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7\") " pod="cert-manager/cert-manager-5b446d88c5-n8tcp" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.692869 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lgg\" (UniqueName: \"kubernetes.io/projected/30c02957-87a8-4ab3-bbcf-9248f4c9ffc6-kube-api-access-c5lgg\") pod \"cert-manager-cainjector-7f985d654d-m2cpf\" (UID: \"30c02957-87a8-4ab3-bbcf-9248f4c9ffc6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.692898 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mctxl\" (UniqueName: \"kubernetes.io/projected/47e455bc-ca7f-42fc-a85d-720561425b25-kube-api-access-mctxl\") pod \"cert-manager-webhook-5655c58dd6-2z957\" (UID: \"47e455bc-ca7f-42fc-a85d-720561425b25\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.713214 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lgg\" (UniqueName: \"kubernetes.io/projected/30c02957-87a8-4ab3-bbcf-9248f4c9ffc6-kube-api-access-c5lgg\") pod \"cert-manager-cainjector-7f985d654d-m2cpf\" (UID: \"30c02957-87a8-4ab3-bbcf-9248f4c9ffc6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.714489 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmqr\" (UniqueName: \"kubernetes.io/projected/f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7-kube-api-access-hjmqr\") pod \"cert-manager-5b446d88c5-n8tcp\" (UID: \"f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7\") " pod="cert-manager/cert-manager-5b446d88c5-n8tcp" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.720118 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mctxl\" (UniqueName: \"kubernetes.io/projected/47e455bc-ca7f-42fc-a85d-720561425b25-kube-api-access-mctxl\") pod \"cert-manager-webhook-5655c58dd6-2z957\" (UID: \"47e455bc-ca7f-42fc-a85d-720561425b25\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.768493 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.779063 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-n8tcp" Oct 01 12:47:04 crc kubenswrapper[4913]: I1001 12:47:04.796295 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.001317 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-m2cpf"] Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.014035 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.262700 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2z957"] Oct 01 12:47:05 crc kubenswrapper[4913]: W1001 12:47:05.265031 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47e455bc_ca7f_42fc_a85d_720561425b25.slice/crio-9f903768793b2712660896b0dacbe9520a587242633905ffa3b3af029eb7ef27 WatchSource:0}: Error finding container 9f903768793b2712660896b0dacbe9520a587242633905ffa3b3af029eb7ef27: Status 404 returned error can't find the container with id 9f903768793b2712660896b0dacbe9520a587242633905ffa3b3af029eb7ef27 Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.266030 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-n8tcp"] Oct 01 12:47:05 crc kubenswrapper[4913]: W1001 12:47:05.271766 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6dbba57_881b_4fcf_8c71_4a7aa5eb7bd7.slice/crio-2789314b83059d341c50bcbbb0ccf46929d37173ac67d445654ec2b356cf1f03 WatchSource:0}: Error finding container 2789314b83059d341c50bcbbb0ccf46929d37173ac67d445654ec2b356cf1f03: Status 404 returned error can't find the container with id 2789314b83059d341c50bcbbb0ccf46929d37173ac67d445654ec2b356cf1f03 Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.818918 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" event={"ID":"47e455bc-ca7f-42fc-a85d-720561425b25","Type":"ContainerStarted","Data":"9f903768793b2712660896b0dacbe9520a587242633905ffa3b3af029eb7ef27"} Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.820640 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" event={"ID":"30c02957-87a8-4ab3-bbcf-9248f4c9ffc6","Type":"ContainerStarted","Data":"083f352061c6855aba362ad228f88e5a46d19cccf87882d1c70505a03e7b7e09"} Oct 01 12:47:05 crc kubenswrapper[4913]: I1001 12:47:05.822088 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-n8tcp" event={"ID":"f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7","Type":"ContainerStarted","Data":"2789314b83059d341c50bcbbb0ccf46929d37173ac67d445654ec2b356cf1f03"} Oct 01 12:47:07 crc kubenswrapper[4913]: I1001 12:47:07.832950 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" event={"ID":"30c02957-87a8-4ab3-bbcf-9248f4c9ffc6","Type":"ContainerStarted","Data":"1fd91ea8b48353c865d3680b9fdc9ef702ebb5a8f8390240715afb6ff06f98c9"} Oct 01 12:47:07 crc kubenswrapper[4913]: I1001 12:47:07.847659 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-m2cpf" podStartSLOduration=2.101516513 podStartE2EDuration="3.847639375s" podCreationTimestamp="2025-10-01 12:47:04 +0000 UTC" firstStartedPulling="2025-10-01 12:47:05.013779944 +0000 UTC m=+556.917255522" 
lastFinishedPulling="2025-10-01 12:47:06.759902806 +0000 UTC m=+558.663378384" observedRunningTime="2025-10-01 12:47:07.846309629 +0000 UTC m=+559.749785227" watchObservedRunningTime="2025-10-01 12:47:07.847639375 +0000 UTC m=+559.751114953" Oct 01 12:47:08 crc kubenswrapper[4913]: I1001 12:47:08.841456 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" event={"ID":"47e455bc-ca7f-42fc-a85d-720561425b25","Type":"ContainerStarted","Data":"6052c4cfaec99fb1e027b54d3e8381268597bfd1975ae1f1c149bde1a5c59713"} Oct 01 12:47:08 crc kubenswrapper[4913]: I1001 12:47:08.842668 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:08 crc kubenswrapper[4913]: I1001 12:47:08.843642 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-n8tcp" event={"ID":"f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7","Type":"ContainerStarted","Data":"b3fee4971cc437c1d9eb56af8fc58d65cb3d12c0e65a6a5950e2e1c107851231"} Oct 01 12:47:08 crc kubenswrapper[4913]: I1001 12:47:08.867282 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-n8tcp" podStartSLOduration=1.830912863 podStartE2EDuration="4.867252519s" podCreationTimestamp="2025-10-01 12:47:04 +0000 UTC" firstStartedPulling="2025-10-01 12:47:05.273756178 +0000 UTC m=+557.177231756" lastFinishedPulling="2025-10-01 12:47:08.310095834 +0000 UTC m=+560.213571412" observedRunningTime="2025-10-01 12:47:08.864309478 +0000 UTC m=+560.767785086" watchObservedRunningTime="2025-10-01 12:47:08.867252519 +0000 UTC m=+560.770728097" Oct 01 12:47:10 crc kubenswrapper[4913]: I1001 12:47:10.083785 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:47:10 crc kubenswrapper[4913]: I1001 12:47:10.085806 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:47:14 crc kubenswrapper[4913]: I1001 12:47:14.798941 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" Oct 01 12:47:14 crc kubenswrapper[4913]: I1001 12:47:14.813346 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2z957" podStartSLOduration=7.833481226 podStartE2EDuration="10.81332837s" podCreationTimestamp="2025-10-01 12:47:04 +0000 UTC" firstStartedPulling="2025-10-01 12:47:05.266859137 +0000 UTC m=+557.170334715" lastFinishedPulling="2025-10-01 12:47:08.246706281 +0000 UTC m=+560.150181859" observedRunningTime="2025-10-01 12:47:08.899626371 +0000 UTC m=+560.803101999" watchObservedRunningTime="2025-10-01 12:47:14.81332837 +0000 UTC m=+566.716803968" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.141826 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-57qvb"] Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.142608 4913 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-controller" containerID="cri-o://9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.142722 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="nbdb" containerID="cri-o://52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.142789 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="northd" containerID="cri-o://8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.142878 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.142959 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-node" containerID="cri-o://b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.143010 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-acl-logging" containerID="cri-o://b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.143127 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="sbdb" containerID="cri-o://78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.185482 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" containerID="cri-o://b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" gracePeriod=30 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.465098 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/3.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.468438 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovn-acl-logging/0.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.469169 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovn-controller/0.log" Oct 01 12:47:15 crc 
kubenswrapper[4913]: I1001 12:47:15.469933 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.531732 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-glx82"] Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532084 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="sbdb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532182 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="sbdb" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532201 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="northd" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532211 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="northd" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532220 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532228 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532288 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kubecfg-setup" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532299 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kubecfg-setup" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532311 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-node" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532369 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-node" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532383 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532392 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532403 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532411 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532422 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-acl-logging" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532455 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-acl-logging" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 
12:47:15.532467 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532475 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532487 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532495 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532507 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532543 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532558 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="nbdb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532566 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="nbdb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532722 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532741 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532749 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532806 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-node" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532815 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="northd" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532823 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532831 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="nbdb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532839 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532845 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532853 4913 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="sbdb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532860 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovn-acl-logging" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.532990 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.532998 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.533080 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerName="ovnkube-controller" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.534893 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.547737 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-node-log\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.547831 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-systemd\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.547830 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-node-log" (OuterVolumeSpecName: "node-log") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.547886 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-netns\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.547927 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.547954 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-env-overrides\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548010 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548076 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovn-node-metrics-cert\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548123 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-netd\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548156 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548169 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-ovn-kubernetes\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548227 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548251 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-log-socket\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548234 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548294 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-log-socket" (OuterVolumeSpecName: "log-socket") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548308 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-ovn\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548327 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548335 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548357 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6nld\" (UniqueName: \"kubernetes.io/projected/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-kube-api-access-v6nld\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548377 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-systemd-units\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548394 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-kubelet\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548411 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-slash\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548425 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548428 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-openvswitch\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548444 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548442 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548464 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-slash" (OuterVolumeSpecName: "host-slash") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548506 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-bin\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548533 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-script-lib\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548558 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-etc-openvswitch\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548589 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548587 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-config\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548633 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-var-lib-openvswitch\") pod \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\" (UID: \"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd\") " Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548732 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovnkube-script-lib\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548754 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-ovn\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548778 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-cni-bin\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548792 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-run-netns\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548810 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-cni-netd\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548827 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-node-log\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548844 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-systemd-units\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548857 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-env-overrides\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548872 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-systemd\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548889 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovnkube-config\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548908 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovn-node-metrics-cert\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548911 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548928 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-etc-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548952 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-slash\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548972 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.548988 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8qp\" (UniqueName: \"kubernetes.io/projected/ee2e8222-fa9d-4270-a809-3898f845d5ec-kube-api-access-lz8qp\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549009 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549031 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-var-lib-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549035 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549049 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-log-socket\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549118 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549124 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-kubelet\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549162 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549279 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549333 4913 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549347 4913 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549356 4913 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549365 4913 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549374 4913 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549382 4913 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549392 4913 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549401 4913 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549410 4913 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549417 4913 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549425 4913 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549433 4913 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549440 4913 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-kubelet\") on node \"crc\" 
DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549449 4913 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.549459 4913 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.552922 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-kube-api-access-v6nld" (OuterVolumeSpecName: "kube-api-access-v6nld") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "kube-api-access-v6nld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.553576 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.562990 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" (UID: "c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.650974 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-cni-netd\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651018 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-node-log\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651037 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-systemd-units\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651053 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-env-overrides\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651071 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-systemd\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651080 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-node-log\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651122 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-systemd\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651086 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovnkube-config\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651239 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovn-node-metrics-cert\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651288 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-etc-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651330 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-slash\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651361 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-etc-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651364 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651391 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651428 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-slash\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651430 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8qp\" (UniqueName: \"kubernetes.io/projected/ee2e8222-fa9d-4270-a809-3898f845d5ec-kube-api-access-lz8qp\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651457 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651479 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-var-lib-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651492 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-log-socket\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651539 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-var-lib-openvswitch\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651562 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651516 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-kubelet\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651605 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651623 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovnkube-script-lib\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651622 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-log-socket\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651664 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651646 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-kubelet\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651691 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-ovn\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651711 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-cni-bin\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651729 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovnkube-config\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651758 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-run-ovn\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651769 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-cni-bin\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651728 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-run-netns\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651789 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-env-overrides\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651807 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-cni-netd\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651812 4913 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651823 4913 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651846 4913 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651848 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-host-run-netns\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651856 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6nld\" (UniqueName: \"kubernetes.io/projected/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-kube-api-access-v6nld\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651897 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee2e8222-fa9d-4270-a809-3898f845d5ec-systemd-units\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.651903 4913 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.652382 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovnkube-script-lib\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.654826 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee2e8222-fa9d-4270-a809-3898f845d5ec-ovn-node-metrics-cert\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.665382 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8qp\" (UniqueName: \"kubernetes.io/projected/ee2e8222-fa9d-4270-a809-3898f845d5ec-kube-api-access-lz8qp\") pod \"ovnkube-node-glx82\" (UID: \"ee2e8222-fa9d-4270-a809-3898f845d5ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.859712 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:15 crc kubenswrapper[4913]: W1001 12:47:15.887169 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2e8222_fa9d_4270_a809_3898f845d5ec.slice/crio-6587c97f0a41d4a3faca37a58838a41cf9f7a550d4f81912b0c3801d409c0ef2 WatchSource:0}: Error finding container 6587c97f0a41d4a3faca37a58838a41cf9f7a550d4f81912b0c3801d409c0ef2: Status 404 returned error can't find the container with id 6587c97f0a41d4a3faca37a58838a41cf9f7a550d4f81912b0c3801d409c0ef2 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.894225 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovnkube-controller/3.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.898875 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovn-acl-logging/0.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.899687 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-57qvb_c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/ovn-controller/0.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900161 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" exitCode=0 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900204 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" exitCode=0 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900221 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" exitCode=0 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900236 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" exitCode=0 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900233 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900259 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" exitCode=0 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900316 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900323 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" exitCode=0 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900341 4913 
generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" exitCode=143 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900355 4913 generic.go:334] "Generic (PLEG): container finished" podID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f" exitCode=143 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900332 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900343 4913 scope.go:117] "RemoveContainer" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900413 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900572 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900592 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900606 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900616 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900622 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900627 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900632 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900324 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900637 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900714 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900720 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900725 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900733 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900741 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900748 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900754 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900760 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900766 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900772 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900778 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900784 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900789 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900793 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900800 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900807 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900815 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900820 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900825 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900830 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900835 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900839 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900845 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900850 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900854 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900861 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-57qvb" event={"ID":"c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd","Type":"ContainerDied","Data":"ecf3d54b60b95b036bd08d19c112cecf401544bb6883215bcb0f7322a1c89609"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900868 4913 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900878 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900885 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900889 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900894 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900900 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900905 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900911 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900915 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.900920 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.903293 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/2.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.903820 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/1.log" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.903857 4913 generic.go:334] "Generic (PLEG): container finished" podID="b2420adf-64bd-4d67-ac95-9337ed10149a" containerID="993223123ddc31a9f0505ba888d1ff1a8341a371ee37bc495e687819f372d309" exitCode=2 Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.903876 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerDied","Data":"993223123ddc31a9f0505ba888d1ff1a8341a371ee37bc495e687819f372d309"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.903888 4913 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9"} Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.904367 4913 scope.go:117] "RemoveContainer" containerID="993223123ddc31a9f0505ba888d1ff1a8341a371ee37bc495e687819f372d309" Oct 01 12:47:15 crc kubenswrapper[4913]: E1001 12:47:15.904523 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zqn52_openshift-multus(b2420adf-64bd-4d67-ac95-9337ed10149a)\"" pod="openshift-multus/multus-zqn52" podUID="b2420adf-64bd-4d67-ac95-9337ed10149a" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.924847 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.941300 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-57qvb"] Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.944597 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-57qvb"] Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.955031 4913 scope.go:117] "RemoveContainer" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.977611 4913 scope.go:117] "RemoveContainer" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" Oct 01 12:47:15 crc kubenswrapper[4913]: I1001 12:47:15.996457 4913 scope.go:117] "RemoveContainer" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.013954 4913 scope.go:117] "RemoveContainer" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.083570 4913 scope.go:117] "RemoveContainer" containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.094294 4913 scope.go:117] "RemoveContainer" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.109495 4913 scope.go:117] "RemoveContainer" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.125449 4913 scope.go:117] "RemoveContainer" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.159626 4913 scope.go:117] "RemoveContainer" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.160116 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": container with ID starting with b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d not found: ID does not exist" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.160150 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} err="failed to get container status 
\"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": rpc error: code = NotFound desc = could not find container \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": container with ID starting with b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.160170 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.160739 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": container with ID starting with 090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd not found: ID does not exist" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.160793 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} err="failed to get container status \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": rpc error: code = NotFound desc = could not find container \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": container with ID starting with 090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.160821 4913 scope.go:117] "RemoveContainer" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.161174 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": container with ID starting with 78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7 not found: ID does not exist" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.161205 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} err="failed to get container status \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": rpc error: code = NotFound desc = could not find container \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": container with ID starting with 78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.161220 4913 scope.go:117] "RemoveContainer" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.161464 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": container with ID starting with 52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a not found: ID does not exist" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.161487 4913 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} err="failed to get container status \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": rpc error: code = NotFound desc = could not find container \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": container with ID starting with 52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.161522 4913 scope.go:117] "RemoveContainer" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.161823 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": container with ID starting with 8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a not found: ID does not exist" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.161850 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} err="failed to get container status \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": rpc error: code = NotFound desc = could not find container \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": container with ID starting with 8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.161864 4913 scope.go:117] "RemoveContainer" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.162229 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": container with ID starting with df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8 not found: ID does not exist" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.162305 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} err="failed to get container status \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": rpc error: code = NotFound desc = could not find container \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": container with ID starting with df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.162354 4913 scope.go:117] "RemoveContainer" containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.162779 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": container with ID starting with b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1 not found: ID does not exist" 
containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.162807 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} err="failed to get container status \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": rpc error: code = NotFound desc = could not find container \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": container with ID starting with b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.162823 4913 scope.go:117] "RemoveContainer" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.163075 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": container with ID starting with b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036 not found: ID does not exist" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.163096 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} err="failed to get container status \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": rpc error: code = NotFound desc = could not find container \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": container with ID starting with b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.163110 4913 scope.go:117] "RemoveContainer" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.163486 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": container with ID starting with 9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f not found: ID does not exist" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.163519 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} err="failed to get container status \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": rpc error: code = NotFound desc = could not find container \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": container with ID starting with 9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.163540 4913 scope.go:117] "RemoveContainer" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a" Oct 01 12:47:16 crc kubenswrapper[4913]: E1001 12:47:16.163917 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": container with ID starting with 913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a not found: ID does not exist" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.163947 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} err="failed to get container status \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": rpc error: code = NotFound desc = could not find container \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": container with ID starting with 913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.163964 4913 scope.go:117] "RemoveContainer" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.164222 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} err="failed to get container status \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": rpc error: code = NotFound desc = could not find container \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": container with ID starting with b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.164241 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.164480 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} err="failed to get container status \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": rpc error: code = NotFound desc = could not find container \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": container with ID starting with 090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.164507 4913 scope.go:117] "RemoveContainer" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.164808 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} err="failed to get container status \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": rpc error: code = NotFound desc = could not find container \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": container with ID starting with 78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.164847 4913 scope.go:117] "RemoveContainer" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165053 4913 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} err="failed to get container status \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": rpc error: code = NotFound desc = could not find container \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": container with ID starting with 52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165084 4913 scope.go:117] "RemoveContainer" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165341 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} err="failed to get container status \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": rpc error: code = NotFound desc = could not find container \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": container with ID starting with 8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165376 4913 scope.go:117] "RemoveContainer" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165557 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} err="failed to get container status \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": rpc error: code = NotFound desc = could not find container \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": container with ID starting with df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165576 4913 scope.go:117] "RemoveContainer" containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165786 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} err="failed to get container status \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": rpc error: code = NotFound desc = could not find container \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": container with ID starting with b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.165807 4913 scope.go:117] "RemoveContainer" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.166293 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} err="failed to get container status \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": rpc error: code = NotFound desc = could not find container \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": container with ID starting with b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036 not found: ID does not exist" Oct 
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.166312 4913 scope.go:117] "RemoveContainer" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.166545 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} err="failed to get container status \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": rpc error: code = NotFound desc = could not find container \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": container with ID starting with 9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.166565 4913 scope.go:117] "RemoveContainer" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.166829 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} err="failed to get container status \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": rpc error: code = NotFound desc = could not find container \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": container with ID starting with 913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.166846 4913 scope.go:117] "RemoveContainer" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.167254 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} err="failed to get container status \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": rpc error: code = NotFound desc = could not find container \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": container with ID starting with b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.167287 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.167648 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} err="failed to get container status \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": rpc error: code = NotFound desc = could not find container \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": container with ID starting with 090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.167669 4913 scope.go:117] "RemoveContainer" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.167897 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} err="failed to get container status \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": rpc error: code = NotFound desc = could not find container \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": container with ID starting with 78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7 not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.167940 4913 scope.go:117] "RemoveContainer" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.168250 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} err="failed to get container status \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": rpc error: code = NotFound desc = could not find container \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": container with ID starting with 52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.168550 4913 scope.go:117] "RemoveContainer" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.168804 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} err="failed to get container status \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": rpc error: code = NotFound desc = could not find container \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": container with ID starting with 8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.168850 4913 scope.go:117] "RemoveContainer" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.169178 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} err="failed to get container status \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": rpc error: code = NotFound desc = could not find container \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": container with ID starting with df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8 not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.169207 4913 scope.go:117] "RemoveContainer" containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.169569 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} err="failed to get container status \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": rpc error: code = NotFound desc = could not find container \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": container with ID starting with b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1 not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.169592 4913 scope.go:117] "RemoveContainer" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.169870 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} err="failed to get container status \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": rpc error: code = NotFound desc = could not find container \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": container with ID starting with b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036 not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.169893 4913 scope.go:117] "RemoveContainer" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.170233 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} err="failed to get container status \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": rpc error: code = NotFound desc = could not find container \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": container with ID starting with 9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.170673 4913 scope.go:117] "RemoveContainer" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.170961 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} err="failed to get container status \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": rpc error: code = NotFound desc = could not find container \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": container with ID starting with 913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.170978 4913 scope.go:117] "RemoveContainer" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.171653 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} err="failed to get container status \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": rpc error: code = NotFound desc = could not find container \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": container with ID starting with b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d not found: ID does not exist"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.171672 4913 scope.go:117] "RemoveContainer" containerID="090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"
Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.171948 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd"} err="failed to get container status \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": rpc error: code = NotFound desc = could not find container \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": container with ID starting with 090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd not found: ID does not exist"
container \"090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd\": container with ID starting with 090fa33aba0e8add115c1b0ce5c479aa9bb137d3d4705625be1f980ecc6c8ffd not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.171976 4913 scope.go:117] "RemoveContainer" containerID="78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.172230 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7"} err="failed to get container status \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": rpc error: code = NotFound desc = could not find container \"78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7\": container with ID starting with 78dd2aa6cd9931313de325c82680f4faeeb19916d0e55a3aba69bb23f3c056f7 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.172252 4913 scope.go:117] "RemoveContainer" containerID="52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.172620 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a"} err="failed to get container status \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": rpc error: code = NotFound desc = could not find container \"52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a\": container with ID starting with 52bd6825c2b4a90d4c9ef11c4bd577e27e526db5352ff4b39e2f6b538e21456a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.172644 4913 scope.go:117] "RemoveContainer" containerID="8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.173739 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a"} err="failed to get container status \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": rpc error: code = NotFound desc = could not find container \"8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a\": container with ID starting with 8d9bbbebad8c1849462d7012fc2c2f9ed4026174f72c6af0b7c489407b4bc22a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.173780 4913 scope.go:117] "RemoveContainer" containerID="df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.175631 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8"} err="failed to get container status \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": rpc error: code = NotFound desc = could not find container \"df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8\": container with ID starting with df04394ac982646ff332aa423544994f73fdcae0efb8200bddbb77a1a403ace8 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.175658 4913 scope.go:117] "RemoveContainer" containerID="b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.176025 4913 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1"} err="failed to get container status \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": rpc error: code = NotFound desc = could not find container \"b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1\": container with ID starting with b4ad06b2d515d93bdbc631a98039db8fda7b597eb928e922d1e979a862f21de1 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.176076 4913 scope.go:117] "RemoveContainer" containerID="b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.177023 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036"} err="failed to get container status \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": rpc error: code = NotFound desc = could not find container \"b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036\": container with ID starting with b385ee067e9acb006f693060a02fd1a206885538c22849dabfb65ce6152ff036 not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.177047 4913 scope.go:117] "RemoveContainer" containerID="9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.177403 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f"} err="failed to get container status \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": rpc error: code = NotFound desc = could not find container \"9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f\": container with ID starting with 9a46ef9b7ac4da9e4774f09316cc81aa5ec19dc01b32f2a7d00ab5fb373e552f not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.177453 4913 scope.go:117] "RemoveContainer" containerID="913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.178395 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a"} err="failed to get container status \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": rpc error: code = NotFound desc = could not find container \"913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a\": container with ID starting with 913e7576f8569764287d3832e2a4e68e1f9c5eb09bcf063f785cdf7dfe67d58a not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.178414 4913 scope.go:117] "RemoveContainer" containerID="b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.178865 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d"} err="failed to get container status \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": rpc error: code = NotFound desc = could not find container \"b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d\": container with ID starting with 
b44e043c032bdaa5daccf3269a90cdb19cf14a9d2c81f0956bf399bbcfd3996d not found: ID does not exist" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.814239 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd" path="/var/lib/kubelet/pods/c6e6adf1-250b-4f6a-94da-8e3ad2cee3bd/volumes" Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.911464 4913 generic.go:334] "Generic (PLEG): container finished" podID="ee2e8222-fa9d-4270-a809-3898f845d5ec" containerID="6ee910fb948f26dda15136341835f43c464dbc72ea580704b8cc4b71e2f224eb" exitCode=0 Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.911555 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerDied","Data":"6ee910fb948f26dda15136341835f43c464dbc72ea580704b8cc4b71e2f224eb"} Oct 01 12:47:16 crc kubenswrapper[4913]: I1001 12:47:16.911604 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"6587c97f0a41d4a3faca37a58838a41cf9f7a550d4f81912b0c3801d409c0ef2"} Oct 01 12:47:17 crc kubenswrapper[4913]: I1001 12:47:17.919340 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"9fd3b34c44adc73aa6c1b79ff2c95215d48ff6aabfeea5ab00f8cd5ae0f3b9d1"} Oct 01 12:47:17 crc kubenswrapper[4913]: I1001 12:47:17.919631 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"4cd0c05cfd5875c66c93aac5f65070e951d7d7e74bd957ec6aa91fc3e92f029e"} Oct 01 12:47:17 crc kubenswrapper[4913]: I1001 12:47:17.919641 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"c2366ea2191584c967eb4b84cdb4d47cdc3d3d4c41b90862e099c31e83758438"} Oct 01 12:47:17 crc kubenswrapper[4913]: I1001 12:47:17.919650 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"5a964e2d66e4cb64aa1a90a2bd755ae44aa23b843d1d9e3197f8f6d18741d8f6"} Oct 01 12:47:17 crc kubenswrapper[4913]: I1001 12:47:17.919659 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"0dd352e812dc156af384ba28a1c1204a25e5deed75897b9b686abf802ec6c486"} Oct 01 12:47:17 crc kubenswrapper[4913]: I1001 12:47:17.919667 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"a1916eb165f690bde34e8e08f84e953612f156e56c401dd32a63ee2eb7535ffb"} Oct 01 12:47:19 crc kubenswrapper[4913]: I1001 12:47:19.933767 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"943c7aa1e2b4cb1c178669eb5f09abac20a6b8a2aaa959629208a4ce75ad315c"} Oct 01 12:47:22 crc kubenswrapper[4913]: I1001 12:47:22.953593 4913 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" event={"ID":"ee2e8222-fa9d-4270-a809-3898f845d5ec","Type":"ContainerStarted","Data":"d1b2bfc04f0e416aef12b2d228a0af4b8170fec1dc3b5ba7def6096b970eff3d"} Oct 01 12:47:22 crc kubenswrapper[4913]: I1001 12:47:22.954033 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:22 crc kubenswrapper[4913]: I1001 12:47:22.954048 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:22 crc kubenswrapper[4913]: I1001 12:47:22.980514 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" podStartSLOduration=7.980448007 podStartE2EDuration="7.980448007s" podCreationTimestamp="2025-10-01 12:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:47:22.976180789 +0000 UTC m=+574.879656407" watchObservedRunningTime="2025-10-01 12:47:22.980448007 +0000 UTC m=+574.883923585" Oct 01 12:47:22 crc kubenswrapper[4913]: I1001 12:47:22.986163 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:23 crc kubenswrapper[4913]: I1001 12:47:23.957973 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:23 crc kubenswrapper[4913]: I1001 12:47:23.983119 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:29 crc kubenswrapper[4913]: I1001 12:47:29.806943 4913 scope.go:117] "RemoveContainer" containerID="993223123ddc31a9f0505ba888d1ff1a8341a371ee37bc495e687819f372d309" Oct 01 12:47:29 crc kubenswrapper[4913]: E1001 12:47:29.807855 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zqn52_openshift-multus(b2420adf-64bd-4d67-ac95-9337ed10149a)\"" pod="openshift-multus/multus-zqn52" podUID="b2420adf-64bd-4d67-ac95-9337ed10149a" Oct 01 12:47:40 crc kubenswrapper[4913]: I1001 12:47:40.083782 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:47:40 crc kubenswrapper[4913]: I1001 12:47:40.084279 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:47:40 crc kubenswrapper[4913]: I1001 12:47:40.084321 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:47:40 crc kubenswrapper[4913]: I1001 12:47:40.084828 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1cc68559427fcac3c481bc75066fa47eb6ec40478fc203e9f25d7c355d20a2fd"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:47:40 crc kubenswrapper[4913]: I1001 12:47:40.084884 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://1cc68559427fcac3c481bc75066fa47eb6ec40478fc203e9f25d7c355d20a2fd" gracePeriod=600 Oct 01 12:47:41 crc kubenswrapper[4913]: I1001 12:47:41.054907 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="1cc68559427fcac3c481bc75066fa47eb6ec40478fc203e9f25d7c355d20a2fd" exitCode=0 Oct 01 12:47:41 crc kubenswrapper[4913]: I1001 12:47:41.054991 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"1cc68559427fcac3c481bc75066fa47eb6ec40478fc203e9f25d7c355d20a2fd"} Oct 01 12:47:41 crc kubenswrapper[4913]: I1001 12:47:41.055382 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"770bb111d4d76e645ce0db85174ab71c7357ffc9ba302bee6b549ccfcb148bea"} Oct 01 12:47:41 crc kubenswrapper[4913]: I1001 12:47:41.055416 4913 scope.go:117] "RemoveContainer" containerID="96dd2b868a1064bbecfc8916fb08a36877895d66c2075b2711ca53f620f29f26" Oct 01 12:47:44 crc kubenswrapper[4913]: I1001 12:47:44.806873 4913 scope.go:117] "RemoveContainer" containerID="993223123ddc31a9f0505ba888d1ff1a8341a371ee37bc495e687819f372d309" Oct 01 12:47:45 crc kubenswrapper[4913]: I1001 12:47:45.084940 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/2.log" Oct 01 12:47:45 crc kubenswrapper[4913]: I1001 12:47:45.085898 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/1.log" Oct 01 12:47:45 crc kubenswrapper[4913]: I1001 12:47:45.085969 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqn52" event={"ID":"b2420adf-64bd-4d67-ac95-9337ed10149a","Type":"ContainerStarted","Data":"8f9b3ea9fe2f35445104f35f3cd2cc22f5d4970c21d2e74568acd20728e2a15d"} Oct 01 12:47:45 crc kubenswrapper[4913]: I1001 12:47:45.881065 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-glx82" Oct 01 12:47:48 crc kubenswrapper[4913]: I1001 12:47:48.950788 4913 scope.go:117] "RemoveContainer" containerID="4a3180a2137b82f1695cd2f7e650f7d36760bc1fb62e77a131035c38edf8b9a9" Oct 01 12:47:49 crc kubenswrapper[4913]: I1001 12:47:49.108536 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqn52_b2420adf-64bd-4d67-ac95-9337ed10149a/kube-multus/2.log" Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.844506 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"] Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.846708 4913 util.go:30] "No 
Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.849124 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.862977 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"]
Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.983569 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.983675 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:56 crc kubenswrapper[4913]: I1001 12:47:56.984337 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2xc\" (UniqueName: \"kubernetes.io/projected/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-kube-api-access-bk2xc\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.085406 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.085479 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.085602 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2xc\" (UniqueName: \"kubernetes.io/projected/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-kube-api-access-bk2xc\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.085876 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.086196 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.121841 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2xc\" (UniqueName: \"kubernetes.io/projected/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-kube-api-access-bk2xc\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.164120 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:47:57 crc kubenswrapper[4913]: I1001 12:47:57.448493 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"]
Oct 01 12:47:58 crc kubenswrapper[4913]: I1001 12:47:58.168947 4913 generic.go:334] "Generic (PLEG): container finished" podID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerID="b313c9aa9ea10e281dac6eed86523822723a55b889c2739d284429fab9f77ac2" exitCode=0
Oct 01 12:47:58 crc kubenswrapper[4913]: I1001 12:47:58.169006 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg" event={"ID":"e3a0a677-1707-4c62-8561-4b1fa7ac7b43","Type":"ContainerDied","Data":"b313c9aa9ea10e281dac6eed86523822723a55b889c2739d284429fab9f77ac2"}
Oct 01 12:47:58 crc kubenswrapper[4913]: I1001 12:47:58.169052 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg" event={"ID":"e3a0a677-1707-4c62-8561-4b1fa7ac7b43","Type":"ContainerStarted","Data":"639ca49bf11518fce0879543cde071a8cd765d4457f867f2a96b93034536f8bf"}
Oct 01 12:48:00 crc kubenswrapper[4913]: I1001 12:48:00.183899 4913 generic.go:334] "Generic (PLEG): container finished" podID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerID="d1e9cca2886d8961c5b9bdf2b8b0bfa2ded86ab8a3a5f4e0705ca3b137fda354" exitCode=0
Oct 01 12:48:00 crc kubenswrapper[4913]: I1001 12:48:00.184014 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg" event={"ID":"e3a0a677-1707-4c62-8561-4b1fa7ac7b43","Type":"ContainerDied","Data":"d1e9cca2886d8961c5b9bdf2b8b0bfa2ded86ab8a3a5f4e0705ca3b137fda354"}
Oct 01 12:48:01 crc kubenswrapper[4913]: I1001 12:48:01.194224 4913 generic.go:334] "Generic (PLEG): container finished" podID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerID="94bc3e5e5a0cb3bb7b2455c3d28272a9ee729c1d7156a92cf3bdb9dff862f5ce" exitCode=0
Oct 01 12:48:01 crc kubenswrapper[4913]: I1001 12:48:01.194299 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg" event={"ID":"e3a0a677-1707-4c62-8561-4b1fa7ac7b43","Type":"ContainerDied","Data":"94bc3e5e5a0cb3bb7b2455c3d28272a9ee729c1d7156a92cf3bdb9dff862f5ce"}
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.497632 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.662604 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-util\") pod \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") "
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.662757 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-bundle\") pod \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") "
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.663518 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk2xc\" (UniqueName: \"kubernetes.io/projected/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-kube-api-access-bk2xc\") pod \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\" (UID: \"e3a0a677-1707-4c62-8561-4b1fa7ac7b43\") "
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.663587 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-bundle" (OuterVolumeSpecName: "bundle") pod "e3a0a677-1707-4c62-8561-4b1fa7ac7b43" (UID: "e3a0a677-1707-4c62-8561-4b1fa7ac7b43"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.663773 4913 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.675546 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-kube-api-access-bk2xc" (OuterVolumeSpecName: "kube-api-access-bk2xc") pod "e3a0a677-1707-4c62-8561-4b1fa7ac7b43" (UID: "e3a0a677-1707-4c62-8561-4b1fa7ac7b43"). InnerVolumeSpecName "kube-api-access-bk2xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.690576 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-util" (OuterVolumeSpecName: "util") pod "e3a0a677-1707-4c62-8561-4b1fa7ac7b43" (UID: "e3a0a677-1707-4c62-8561-4b1fa7ac7b43"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.764831 4913 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-util\") on node \"crc\" DevicePath \"\""
Oct 01 12:48:02 crc kubenswrapper[4913]: I1001 12:48:02.764881 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk2xc\" (UniqueName: \"kubernetes.io/projected/e3a0a677-1707-4c62-8561-4b1fa7ac7b43-kube-api-access-bk2xc\") on node \"crc\" DevicePath \"\""
Oct 01 12:48:03 crc kubenswrapper[4913]: I1001 12:48:03.209089 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg" event={"ID":"e3a0a677-1707-4c62-8561-4b1fa7ac7b43","Type":"ContainerDied","Data":"639ca49bf11518fce0879543cde071a8cd765d4457f867f2a96b93034536f8bf"}
Oct 01 12:48:03 crc kubenswrapper[4913]: I1001 12:48:03.209133 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639ca49bf11518fce0879543cde071a8cd765d4457f867f2a96b93034536f8bf"
Oct 01 12:48:03 crc kubenswrapper[4913]: I1001 12:48:03.209160 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg"
Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.479126 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh"]
Oct 01 12:48:04 crc kubenswrapper[4913]: E1001 12:48:04.479561 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="util"
Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.479585 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="util"
Oct 01 12:48:04 crc kubenswrapper[4913]: E1001 12:48:04.479674 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="pull"
Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.479687 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="pull"
Oct 01 12:48:04 crc kubenswrapper[4913]: E1001 12:48:04.479704 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="extract"
Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.479719 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="extract"
Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.479960 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a0a677-1707-4c62-8561-4b1fa7ac7b43" containerName="extract"
Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.480701 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh"
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.483225 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.483376 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.492317 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh"] Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.494378 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9rv88" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.585092 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npq9g\" (UniqueName: \"kubernetes.io/projected/652be88b-3bb3-4c4d-9aa7-bc1494c53cb3-kube-api-access-npq9g\") pod \"nmstate-operator-5d6f6cfd66-s6pxh\" (UID: \"652be88b-3bb3-4c4d-9aa7-bc1494c53cb3\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.687425 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npq9g\" (UniqueName: \"kubernetes.io/projected/652be88b-3bb3-4c4d-9aa7-bc1494c53cb3-kube-api-access-npq9g\") pod \"nmstate-operator-5d6f6cfd66-s6pxh\" (UID: \"652be88b-3bb3-4c4d-9aa7-bc1494c53cb3\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.713923 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npq9g\" (UniqueName: \"kubernetes.io/projected/652be88b-3bb3-4c4d-9aa7-bc1494c53cb3-kube-api-access-npq9g\") pod \"nmstate-operator-5d6f6cfd66-s6pxh\" (UID: \"652be88b-3bb3-4c4d-9aa7-bc1494c53cb3\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.799003 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" Oct 01 12:48:04 crc kubenswrapper[4913]: I1001 12:48:04.992668 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh"] Oct 01 12:48:05 crc kubenswrapper[4913]: W1001 12:48:05.015411 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652be88b_3bb3_4c4d_9aa7_bc1494c53cb3.slice/crio-6ded7cf94df30869cf17441c3e09cfd422b86a695520e2f96ad65018fcb59959 WatchSource:0}: Error finding container 6ded7cf94df30869cf17441c3e09cfd422b86a695520e2f96ad65018fcb59959: Status 404 returned error can't find the container with id 6ded7cf94df30869cf17441c3e09cfd422b86a695520e2f96ad65018fcb59959 Oct 01 12:48:05 crc kubenswrapper[4913]: I1001 12:48:05.218024 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" event={"ID":"652be88b-3bb3-4c4d-9aa7-bc1494c53cb3","Type":"ContainerStarted","Data":"6ded7cf94df30869cf17441c3e09cfd422b86a695520e2f96ad65018fcb59959"} Oct 01 12:48:07 crc kubenswrapper[4913]: I1001 12:48:07.229053 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" event={"ID":"652be88b-3bb3-4c4d-9aa7-bc1494c53cb3","Type":"ContainerStarted","Data":"64cba8586e4603fe0ef4c0057ffcb0fc49149cd5073141f81107020b1089e64d"} Oct 01 12:48:07 crc kubenswrapper[4913]: I1001 12:48:07.249102 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-s6pxh" podStartSLOduration=1.200025387 podStartE2EDuration="3.249085759s" podCreationTimestamp="2025-10-01 12:48:04 +0000 UTC" firstStartedPulling="2025-10-01 12:48:05.017811797 +0000 UTC m=+616.921287375" lastFinishedPulling="2025-10-01 12:48:07.066872169 +0000 UTC m=+618.970347747" observedRunningTime="2025-10-01 12:48:07.243773632 +0000 UTC m=+619.147249240" watchObservedRunningTime="2025-10-01 12:48:07.249085759 +0000 UTC m=+619.152561347" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.281655 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-r6h59"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.283065 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.285606 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4j7r8" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.290332 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-bnx22"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.291088 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.296191 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.299832 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-r6h59"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.309526 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-bnx22"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.343855 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wv22\" (UniqueName: \"kubernetes.io/projected/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-kube-api-access-7wv22\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.344185 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.350313 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-trqgg"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.351085 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-trqgg" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.409677 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.410379 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.411887 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.412169 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4dmzp" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.414329 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.432373 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"] Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445244 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svlsk\" (UniqueName: \"kubernetes.io/projected/e6565ecd-6027-4555-888c-da3a16c20260-kube-api-access-svlsk\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445385 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-dbus-socket\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445436 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6565ecd-6027-4555-888c-da3a16c20260-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445469 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445514 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt8x5\" (UniqueName: \"kubernetes.io/projected/86077535-3ad3-414d-ad9d-5b0107ec2cf0-kube-api-access-lt8x5\") pod \"nmstate-metrics-58fcddf996-r6h59\" (UID: \"86077535-3ad3-414d-ad9d-5b0107ec2cf0\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445588 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9p9m\" (UniqueName: \"kubernetes.io/projected/0ca9e5b5-947e-426a-81cc-8ce9774da263-kube-api-access-c9p9m\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg" Oct 01 12:48:08 crc kubenswrapper[4913]: E1001 12:48:08.445624 4913 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 01 12:48:08 crc 
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445635 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-ovs-socket\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: E1001 12:48:08.445677 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-tls-key-pair podName:bad76a8b-0b0d-4a6e-870d-14f138beb4fb nodeName:}" failed. No retries permitted until 2025-10-01 12:48:08.945659858 +0000 UTC m=+620.849135436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-tls-key-pair") pod "nmstate-webhook-6d689559c5-bnx22" (UID: "bad76a8b-0b0d-4a6e-870d-14f138beb4fb") : secret "openshift-nmstate-webhook" not found
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445692 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-nmstate-lock\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445718 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wv22\" (UniqueName: \"kubernetes.io/projected/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-kube-api-access-7wv22\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.445763 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e6565ecd-6027-4555-888c-da3a16c20260-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.463181 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wv22\" (UniqueName: \"kubernetes.io/projected/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-kube-api-access-7wv22\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546667 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt8x5\" (UniqueName: \"kubernetes.io/projected/86077535-3ad3-414d-ad9d-5b0107ec2cf0-kube-api-access-lt8x5\") pod \"nmstate-metrics-58fcddf996-r6h59\" (UID: \"86077535-3ad3-414d-ad9d-5b0107ec2cf0\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546734 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9p9m\" (UniqueName: \"kubernetes.io/projected/0ca9e5b5-947e-426a-81cc-8ce9774da263-kube-api-access-c9p9m\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546774 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-ovs-socket\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546796 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-nmstate-lock\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546831 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e6565ecd-6027-4555-888c-da3a16c20260-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546860 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svlsk\" (UniqueName: \"kubernetes.io/projected/e6565ecd-6027-4555-888c-da3a16c20260-kube-api-access-svlsk\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546917 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-dbus-socket\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546950 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6565ecd-6027-4555-888c-da3a16c20260-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.546996 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-nmstate-lock\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: E1001 12:48:08.547068 4913 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Oct 01 12:48:08 crc kubenswrapper[4913]: E1001 12:48:08.547116 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6565ecd-6027-4555-888c-da3a16c20260-plugin-serving-cert podName:e6565ecd-6027-4555-888c-da3a16c20260 nodeName:}" failed. No retries permitted until 2025-10-01 12:48:09.047099621 +0000 UTC m=+620.950575199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e6565ecd-6027-4555-888c-da3a16c20260-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-nh8ft" (UID: "e6565ecd-6027-4555-888c-da3a16c20260") : secret "plugin-serving-cert" not found
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.547163 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-ovs-socket\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.547488 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0ca9e5b5-947e-426a-81cc-8ce9774da263-dbus-socket\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.548088 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e6565ecd-6027-4555-888c-da3a16c20260-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.564063 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svlsk\" (UniqueName: \"kubernetes.io/projected/e6565ecd-6027-4555-888c-da3a16c20260-kube-api-access-svlsk\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.566829 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9p9m\" (UniqueName: \"kubernetes.io/projected/0ca9e5b5-947e-426a-81cc-8ce9774da263-kube-api-access-c9p9m\") pod \"nmstate-handler-trqgg\" (UID: \"0ca9e5b5-947e-426a-81cc-8ce9774da263\") " pod="openshift-nmstate/nmstate-handler-trqgg"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.590049 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt8x5\" (UniqueName: \"kubernetes.io/projected/86077535-3ad3-414d-ad9d-5b0107ec2cf0-kube-api-access-lt8x5\") pod \"nmstate-metrics-58fcddf996-r6h59\" (UID: \"86077535-3ad3-414d-ad9d-5b0107ec2cf0\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.609603 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-797fd58d79-l2gb5"]
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.610410 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797fd58d79-l2gb5"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.613613 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59"
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.627632 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797fd58d79-l2gb5"]
Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.666498 4913 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-handler-trqgg" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751113 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-config\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751412 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-serving-cert\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751430 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-oauth-config\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751449 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-trusted-ca-bundle\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751487 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jsw9\" (UniqueName: \"kubernetes.io/projected/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-kube-api-access-4jsw9\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751514 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-service-ca\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.751530 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-oauth-serving-cert\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852081 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jsw9\" (UniqueName: \"kubernetes.io/projected/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-kube-api-access-4jsw9\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852159 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-service-ca\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852184 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-oauth-serving-cert\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852251 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-config\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852297 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-serving-cert\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852319 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-oauth-config\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.853372 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-service-ca\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.853403 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-oauth-serving-cert\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.852344 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-trusted-ca-bundle\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.856236 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-config\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.856388 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-trusted-ca-bundle\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.861677 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-serving-cert\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.864196 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-console-oauth-config\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.872066 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jsw9\" (UniqueName: \"kubernetes.io/projected/74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06-kube-api-access-4jsw9\") pod \"console-797fd58d79-l2gb5\" (UID: \"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06\") " pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.924020 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.955026 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:08 crc kubenswrapper[4913]: I1001 12:48:08.959540 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bad76a8b-0b0d-4a6e-870d-14f138beb4fb-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-bnx22\" (UID: \"bad76a8b-0b0d-4a6e-870d-14f138beb4fb\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.056257 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6565ecd-6027-4555-888c-da3a16c20260-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.064980 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6565ecd-6027-4555-888c-da3a16c20260-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nh8ft\" (UID: \"e6565ecd-6027-4555-888c-da3a16c20260\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.065977 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-r6h59"] Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.198554 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797fd58d79-l2gb5"] 
Oct 01 12:48:09 crc kubenswrapper[4913]: W1001 12:48:09.200763 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74bf44db_9b8b_4fe7_bfa7_b5b37a5cee06.slice/crio-ee533038662763e6a8f07f674a3cd271189623bc852fcb3e0c97073d3002ea5c WatchSource:0}: Error finding container ee533038662763e6a8f07f674a3cd271189623bc852fcb3e0c97073d3002ea5c: Status 404 returned error can't find the container with id ee533038662763e6a8f07f674a3cd271189623bc852fcb3e0c97073d3002ea5c Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.225135 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.243294 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59" event={"ID":"86077535-3ad3-414d-ad9d-5b0107ec2cf0","Type":"ContainerStarted","Data":"4a566dc6e9472f1ae6404878ddb8249aded457f0b110c3cda2d8591f6605e6ab"} Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.244706 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-trqgg" event={"ID":"0ca9e5b5-947e-426a-81cc-8ce9774da263","Type":"ContainerStarted","Data":"6b6c636b5441acaef2ab0a95c4c354ae070fcb944e75be879ac1c0a550366730"} Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.245845 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797fd58d79-l2gb5" event={"ID":"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06","Type":"ContainerStarted","Data":"ee533038662763e6a8f07f674a3cd271189623bc852fcb3e0c97073d3002ea5c"} Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.323353 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.401785 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-bnx22"] Oct 01 12:48:09 crc kubenswrapper[4913]: W1001 12:48:09.417901 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad76a8b_0b0d_4a6e_870d_14f138beb4fb.slice/crio-1fb4839a0bd8c4acb217f1ceba313eea61f08a1522bb810ee19abaad5d55b1b5 WatchSource:0}: Error finding container 1fb4839a0bd8c4acb217f1ceba313eea61f08a1522bb810ee19abaad5d55b1b5: Status 404 returned error can't find the container with id 1fb4839a0bd8c4acb217f1ceba313eea61f08a1522bb810ee19abaad5d55b1b5 Oct 01 12:48:09 crc kubenswrapper[4913]: I1001 12:48:09.518000 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft"] Oct 01 12:48:09 crc kubenswrapper[4913]: W1001 12:48:09.520520 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6565ecd_6027_4555_888c_da3a16c20260.slice/crio-f097f7ba4177b6b8d45874edba44a3f1a4ea61a4bc60a097bf3c9dc49b60b951 WatchSource:0}: Error finding container f097f7ba4177b6b8d45874edba44a3f1a4ea61a4bc60a097bf3c9dc49b60b951: Status 404 returned error can't find the container with id f097f7ba4177b6b8d45874edba44a3f1a4ea61a4bc60a097bf3c9dc49b60b951 Oct 01 12:48:10 crc kubenswrapper[4913]: I1001 12:48:10.254161 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" event={"ID":"e6565ecd-6027-4555-888c-da3a16c20260","Type":"ContainerStarted","Data":"f097f7ba4177b6b8d45874edba44a3f1a4ea61a4bc60a097bf3c9dc49b60b951"} Oct 01 12:48:10 crc kubenswrapper[4913]: I1001 12:48:10.255980 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797fd58d79-l2gb5" event={"ID":"74bf44db-9b8b-4fe7-bfa7-b5b37a5cee06","Type":"ContainerStarted","Data":"5f5048e992ae79fc88507013de28248a68e47c122f956836ee7a95bd0f5422d3"} Oct 01 12:48:10 crc kubenswrapper[4913]: I1001 12:48:10.259938 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" event={"ID":"bad76a8b-0b0d-4a6e-870d-14f138beb4fb","Type":"ContainerStarted","Data":"1fb4839a0bd8c4acb217f1ceba313eea61f08a1522bb810ee19abaad5d55b1b5"} Oct 01 12:48:10 crc kubenswrapper[4913]: I1001 12:48:10.285969 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-797fd58d79-l2gb5" podStartSLOduration=2.28593898 podStartE2EDuration="2.28593898s" podCreationTimestamp="2025-10-01 12:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:48:10.277176075 +0000 UTC m=+622.180651733" watchObservedRunningTime="2025-10-01 12:48:10.28593898 +0000 UTC m=+622.189414598" Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.273482 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59" event={"ID":"86077535-3ad3-414d-ad9d-5b0107ec2cf0","Type":"ContainerStarted","Data":"413771286efcc19c09d2e4faaf5bf82932c8d819cfd20eb13cce1aa42c49f20b"} Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.275352 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-trqgg" event={"ID":"0ca9e5b5-947e-426a-81cc-8ce9774da263","Type":"ContainerStarted","Data":"dd65b78aedd2b36580dd1cc3b3cffff7cfb574ac15f60bfe0ffd1929c7c27524"} Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.275511 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-trqgg" Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.277131 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" event={"ID":"bad76a8b-0b0d-4a6e-870d-14f138beb4fb","Type":"ContainerStarted","Data":"eb1f2ac468ebc4953a6d68762d1b9a889262c1a7410eb09b2dbb1a61fda2f7d8"} Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.277305 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.292982 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-trqgg" podStartSLOduration=1.336514469 podStartE2EDuration="4.292963621s" podCreationTimestamp="2025-10-01 12:48:08 +0000 UTC" firstStartedPulling="2025-10-01 12:48:08.703994787 +0000 UTC m=+620.607470365" lastFinishedPulling="2025-10-01 12:48:11.660443939 +0000 UTC m=+623.563919517" observedRunningTime="2025-10-01 12:48:12.286990125 +0000 UTC m=+624.190465733" watchObservedRunningTime="2025-10-01 12:48:12.292963621 +0000 UTC m=+624.196439209" Oct 01 12:48:12 crc kubenswrapper[4913]: I1001 12:48:12.316564 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" podStartSLOduration=2.074845394 podStartE2EDuration="4.316538497s" podCreationTimestamp="2025-10-01 12:48:08 +0000 UTC" firstStartedPulling="2025-10-01 12:48:09.41994059 +0000 UTC m=+621.323416168" lastFinishedPulling="2025-10-01 12:48:11.661633693 +0000 UTC m=+623.565109271" observedRunningTime="2025-10-01 12:48:12.302892558 +0000 UTC m=+624.206368136" watchObservedRunningTime="2025-10-01 12:48:12.316538497 +0000 UTC m=+624.220014075" Oct 01 12:48:13 crc kubenswrapper[4913]: I1001 12:48:13.286789 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" event={"ID":"e6565ecd-6027-4555-888c-da3a16c20260","Type":"ContainerStarted","Data":"0acea78cdb87771bd495e23aafec23359f6312d8e2a8edbde2922ff330e88edc"} Oct 01 12:48:13 crc kubenswrapper[4913]: I1001 12:48:13.304394 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nh8ft" podStartSLOduration=2.217110664 podStartE2EDuration="5.304372437s" podCreationTimestamp="2025-10-01 12:48:08 +0000 UTC" firstStartedPulling="2025-10-01 12:48:09.522440643 +0000 UTC m=+621.425916211" lastFinishedPulling="2025-10-01 12:48:12.609702406 +0000 UTC m=+624.513177984" observedRunningTime="2025-10-01 12:48:13.303784331 +0000 UTC m=+625.207259989" watchObservedRunningTime="2025-10-01 12:48:13.304372437 +0000 UTC m=+625.207848015" Oct 01 12:48:15 crc kubenswrapper[4913]: I1001 12:48:15.301726 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59" event={"ID":"86077535-3ad3-414d-ad9d-5b0107ec2cf0","Type":"ContainerStarted","Data":"8439f8cb7c0d9865ea22d9a32f433deae758f0a9e2aaff87eeb7760f38a626b6"} Oct 01 12:48:15 crc kubenswrapper[4913]: I1001 12:48:15.324572 4913 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-r6h59" podStartSLOduration=2.251020697 podStartE2EDuration="7.324550525s" podCreationTimestamp="2025-10-01 12:48:08 +0000 UTC" firstStartedPulling="2025-10-01 12:48:09.079777554 +0000 UTC m=+620.983253152" lastFinishedPulling="2025-10-01 12:48:14.153307402 +0000 UTC m=+626.056782980" observedRunningTime="2025-10-01 12:48:15.324290438 +0000 UTC m=+627.227766036" watchObservedRunningTime="2025-10-01 12:48:15.324550525 +0000 UTC m=+627.228026113" Oct 01 12:48:18 crc kubenswrapper[4913]: I1001 12:48:18.693655 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-trqgg" Oct 01 12:48:18 crc kubenswrapper[4913]: I1001 12:48:18.925309 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:18 crc kubenswrapper[4913]: I1001 12:48:18.925452 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:18 crc kubenswrapper[4913]: I1001 12:48:18.933569 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:19 crc kubenswrapper[4913]: I1001 12:48:19.347691 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-797fd58d79-l2gb5" Oct 01 12:48:19 crc kubenswrapper[4913]: I1001 12:48:19.409260 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-97mb9"] Oct 01 12:48:29 crc kubenswrapper[4913]: I1001 12:48:29.232982 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-bnx22" Oct 01 12:48:44 crc kubenswrapper[4913]: I1001 12:48:44.476782 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-97mb9" podUID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" containerName="console" containerID="cri-o://64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd" gracePeriod=15 Oct 01 12:48:44 crc kubenswrapper[4913]: I1001 12:48:44.876502 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-97mb9_4d2bd20a-3d8d-4073-aca4-ceca547c186f/console/0.log" Oct 01 12:48:44 crc kubenswrapper[4913]: I1001 12:48:44.876568 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.051587 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-serving-cert\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.051964 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-config\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052001 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-oauth-serving-cert\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052037 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-service-ca\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052059 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp965\" (UniqueName: \"kubernetes.io/projected/4d2bd20a-3d8d-4073-aca4-ceca547c186f-kube-api-access-gp965\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052082 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-oauth-config\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052107 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-trusted-ca-bundle\") pod \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\" (UID: \"4d2bd20a-3d8d-4073-aca4-ceca547c186f\") " Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052963 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.052980 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.053029 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-service-ca" (OuterVolumeSpecName: "service-ca") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.053209 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-config" (OuterVolumeSpecName: "console-config") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.058666 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.064011 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.065223 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2bd20a-3d8d-4073-aca4-ceca547c186f-kube-api-access-gp965" (OuterVolumeSpecName: "kube-api-access-gp965") pod "4d2bd20a-3d8d-4073-aca4-ceca547c186f" (UID: "4d2bd20a-3d8d-4073-aca4-ceca547c186f"). InnerVolumeSpecName "kube-api-access-gp965". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153455 4913 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153501 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp965\" (UniqueName: \"kubernetes.io/projected/4d2bd20a-3d8d-4073-aca4-ceca547c186f-kube-api-access-gp965\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153517 4913 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153529 4913 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153541 4913 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153552 4913 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.153571 4913 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d2bd20a-3d8d-4073-aca4-ceca547c186f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.443667 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j"] Oct 01 12:48:45 crc kubenswrapper[4913]: E1001 12:48:45.444015 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" containerName="console" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.444035 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" containerName="console" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.444205 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" containerName="console" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.445474 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.448344 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.454985 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j"] Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.522913 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-97mb9_4d2bd20a-3d8d-4073-aca4-ceca547c186f/console/0.log" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.522968 4913 generic.go:334] "Generic (PLEG): container finished" podID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" containerID="64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd" exitCode=2 Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.522999 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97mb9" event={"ID":"4d2bd20a-3d8d-4073-aca4-ceca547c186f","Type":"ContainerDied","Data":"64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd"} Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.523057 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-97mb9" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.523073 4913 scope.go:117] "RemoveContainer" containerID="64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.523057 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97mb9" event={"ID":"4d2bd20a-3d8d-4073-aca4-ceca547c186f","Type":"ContainerDied","Data":"8005f4076373b0a70b9c49a30a9196586722949640608da74483ba5127dd970f"} Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.548260 4913 scope.go:117] "RemoveContainer" containerID="64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd" Oct 01 12:48:45 crc kubenswrapper[4913]: E1001 12:48:45.549346 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd\": container with ID starting with 64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd not found: ID does not exist" containerID="64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.549381 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd"} err="failed to get container status \"64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd\": rpc error: code = NotFound desc = could not find container \"64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd\": container with ID starting with 64f6a126c69e13d68237720011f9ea9198b3ff8b559d4d6eff1d672c54f78ecd not found: ID does not exist" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.555377 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-97mb9"] Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.557847 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.557913 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws65w\" (UniqueName: \"kubernetes.io/projected/a7dfff20-4135-47df-a052-3cf904ea1263-kube-api-access-ws65w\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.557967 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.560412 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-97mb9"] Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.659768 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws65w\" (UniqueName: \"kubernetes.io/projected/a7dfff20-4135-47df-a052-3cf904ea1263-kube-api-access-ws65w\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.659879 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.659920 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.660633 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.660832 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-util\") pod 
\"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.692984 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws65w\" (UniqueName: \"kubernetes.io/projected/a7dfff20-4135-47df-a052-3cf904ea1263-kube-api-access-ws65w\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:45 crc kubenswrapper[4913]: I1001 12:48:45.767851 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:46 crc kubenswrapper[4913]: I1001 12:48:46.011622 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j"] Oct 01 12:48:46 crc kubenswrapper[4913]: I1001 12:48:46.531321 4913 generic.go:334] "Generic (PLEG): container finished" podID="a7dfff20-4135-47df-a052-3cf904ea1263" containerID="e203ac475280dfd3b36490a959c6834f40a0776677e8f113f9e57446aa226dd0" exitCode=0 Oct 01 12:48:46 crc kubenswrapper[4913]: I1001 12:48:46.531420 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" event={"ID":"a7dfff20-4135-47df-a052-3cf904ea1263","Type":"ContainerDied","Data":"e203ac475280dfd3b36490a959c6834f40a0776677e8f113f9e57446aa226dd0"} Oct 01 12:48:46 crc kubenswrapper[4913]: I1001 12:48:46.531691 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" event={"ID":"a7dfff20-4135-47df-a052-3cf904ea1263","Type":"ContainerStarted","Data":"65b8a00ac38c720c26f99fd7c1527330944ee6e13733209a5fffec9948825fcb"} Oct 01 12:48:46 crc kubenswrapper[4913]: I1001 12:48:46.819587 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2bd20a-3d8d-4073-aca4-ceca547c186f" path="/var/lib/kubelet/pods/4d2bd20a-3d8d-4073-aca4-ceca547c186f/volumes" Oct 01 12:48:49 crc kubenswrapper[4913]: I1001 12:48:49.582639 4913 generic.go:334] "Generic (PLEG): container finished" podID="a7dfff20-4135-47df-a052-3cf904ea1263" containerID="4e6e63a2ad58c8d14eac217c5a987e7f87d4d8adaf2c9986a7788f897adefb3e" exitCode=0 Oct 01 12:48:49 crc kubenswrapper[4913]: I1001 12:48:49.582739 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" event={"ID":"a7dfff20-4135-47df-a052-3cf904ea1263","Type":"ContainerDied","Data":"4e6e63a2ad58c8d14eac217c5a987e7f87d4d8adaf2c9986a7788f897adefb3e"} Oct 01 12:48:50 crc kubenswrapper[4913]: I1001 12:48:50.594171 4913 generic.go:334] "Generic (PLEG): container finished" podID="a7dfff20-4135-47df-a052-3cf904ea1263" containerID="c3254d34d50680a007cf634e72f32846fcf7d01dfd74da8aec1b60a216b2efc5" exitCode=0 Oct 01 12:48:50 crc kubenswrapper[4913]: I1001 12:48:50.594246 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" 
event={"ID":"a7dfff20-4135-47df-a052-3cf904ea1263","Type":"ContainerDied","Data":"c3254d34d50680a007cf634e72f32846fcf7d01dfd74da8aec1b60a216b2efc5"} Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.843368 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.952155 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-bundle\") pod \"a7dfff20-4135-47df-a052-3cf904ea1263\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.952529 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-util\") pod \"a7dfff20-4135-47df-a052-3cf904ea1263\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.952561 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws65w\" (UniqueName: \"kubernetes.io/projected/a7dfff20-4135-47df-a052-3cf904ea1263-kube-api-access-ws65w\") pod \"a7dfff20-4135-47df-a052-3cf904ea1263\" (UID: \"a7dfff20-4135-47df-a052-3cf904ea1263\") " Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.953753 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-bundle" (OuterVolumeSpecName: "bundle") pod "a7dfff20-4135-47df-a052-3cf904ea1263" (UID: "a7dfff20-4135-47df-a052-3cf904ea1263"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.957956 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7dfff20-4135-47df-a052-3cf904ea1263-kube-api-access-ws65w" (OuterVolumeSpecName: "kube-api-access-ws65w") pod "a7dfff20-4135-47df-a052-3cf904ea1263" (UID: "a7dfff20-4135-47df-a052-3cf904ea1263"). InnerVolumeSpecName "kube-api-access-ws65w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:48:51 crc kubenswrapper[4913]: I1001 12:48:51.962341 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-util" (OuterVolumeSpecName: "util") pod "a7dfff20-4135-47df-a052-3cf904ea1263" (UID: "a7dfff20-4135-47df-a052-3cf904ea1263"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:48:52 crc kubenswrapper[4913]: I1001 12:48:52.054612 4913 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:52 crc kubenswrapper[4913]: I1001 12:48:52.054674 4913 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7dfff20-4135-47df-a052-3cf904ea1263-util\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:52 crc kubenswrapper[4913]: I1001 12:48:52.054700 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws65w\" (UniqueName: \"kubernetes.io/projected/a7dfff20-4135-47df-a052-3cf904ea1263-kube-api-access-ws65w\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:52 crc kubenswrapper[4913]: I1001 12:48:52.609697 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" event={"ID":"a7dfff20-4135-47df-a052-3cf904ea1263","Type":"ContainerDied","Data":"65b8a00ac38c720c26f99fd7c1527330944ee6e13733209a5fffec9948825fcb"} Oct 01 12:48:52 crc kubenswrapper[4913]: I1001 12:48:52.609740 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b8a00ac38c720c26f99fd7c1527330944ee6e13733209a5fffec9948825fcb" Oct 01 12:48:52 crc kubenswrapper[4913]: I1001 12:48:52.609861 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.426420 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n"] Oct 01 12:49:00 crc kubenswrapper[4913]: E1001 12:49:00.426966 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="pull" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.426984 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="pull" Oct 01 12:49:00 crc kubenswrapper[4913]: E1001 12:49:00.427006 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="util" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.427013 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="util" Oct 01 12:49:00 crc kubenswrapper[4913]: E1001 12:49:00.427027 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="extract" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.427035 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="extract" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.427157 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dfff20-4135-47df-a052-3cf904ea1263" containerName="extract" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.427671 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.429900 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.430002 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.430106 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-wxhz7" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.430208 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.430561 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.452155 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n"] Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.565443 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9cbj\" (UniqueName: \"kubernetes.io/projected/3028f0ec-9b00-468d-919b-5ed3b066bade-kube-api-access-m9cbj\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.565499 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3028f0ec-9b00-468d-919b-5ed3b066bade-apiservice-cert\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.565601 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3028f0ec-9b00-468d-919b-5ed3b066bade-webhook-cert\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.666978 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3028f0ec-9b00-468d-919b-5ed3b066bade-webhook-cert\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.667076 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9cbj\" (UniqueName: \"kubernetes.io/projected/3028f0ec-9b00-468d-919b-5ed3b066bade-kube-api-access-m9cbj\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.667102 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3028f0ec-9b00-468d-919b-5ed3b066bade-apiservice-cert\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.672778 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3028f0ec-9b00-468d-919b-5ed3b066bade-webhook-cert\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.673727 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3028f0ec-9b00-468d-919b-5ed3b066bade-apiservice-cert\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.686703 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9cbj\" (UniqueName: \"kubernetes.io/projected/3028f0ec-9b00-468d-919b-5ed3b066bade-kube-api-access-m9cbj\") pod \"metallb-operator-controller-manager-74c9fc44b9-z2f2n\" (UID: \"3028f0ec-9b00-468d-919b-5ed3b066bade\") " pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.746041 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.782914 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw"] Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.783584 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.787933 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8j68k" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.788129 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.788292 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.817759 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw"] Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.975873 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7bc350db-085b-4d87-a4a8-cb77c25746f9-apiservice-cert\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.975969 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6zp\" (UniqueName: \"kubernetes.io/projected/7bc350db-085b-4d87-a4a8-cb77c25746f9-kube-api-access-9d6zp\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:00 crc kubenswrapper[4913]: I1001 12:49:00.975996 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7bc350db-085b-4d87-a4a8-cb77c25746f9-webhook-cert\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.018915 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n"] Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.077314 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6zp\" (UniqueName: \"kubernetes.io/projected/7bc350db-085b-4d87-a4a8-cb77c25746f9-kube-api-access-9d6zp\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.077362 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7bc350db-085b-4d87-a4a8-cb77c25746f9-webhook-cert\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.077390 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7bc350db-085b-4d87-a4a8-cb77c25746f9-apiservice-cert\") pod 
\"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.082817 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7bc350db-085b-4d87-a4a8-cb77c25746f9-apiservice-cert\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.082879 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7bc350db-085b-4d87-a4a8-cb77c25746f9-webhook-cert\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.097046 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6zp\" (UniqueName: \"kubernetes.io/projected/7bc350db-085b-4d87-a4a8-cb77c25746f9-kube-api-access-9d6zp\") pod \"metallb-operator-webhook-server-64d4d5b6f-d7bsw\" (UID: \"7bc350db-085b-4d87-a4a8-cb77c25746f9\") " pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.154975 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.440307 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw"] Oct 01 12:49:01 crc kubenswrapper[4913]: W1001 12:49:01.447906 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bc350db_085b_4d87_a4a8_cb77c25746f9.slice/crio-5412a3cf487838340a0a137c729bfa187f9401645ed45dc746fd7a7aa14dc1f5 WatchSource:0}: Error finding container 5412a3cf487838340a0a137c729bfa187f9401645ed45dc746fd7a7aa14dc1f5: Status 404 returned error can't find the container with id 5412a3cf487838340a0a137c729bfa187f9401645ed45dc746fd7a7aa14dc1f5 Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.670502 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" event={"ID":"7bc350db-085b-4d87-a4a8-cb77c25746f9","Type":"ContainerStarted","Data":"5412a3cf487838340a0a137c729bfa187f9401645ed45dc746fd7a7aa14dc1f5"} Oct 01 12:49:01 crc kubenswrapper[4913]: I1001 12:49:01.672493 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" event={"ID":"3028f0ec-9b00-468d-919b-5ed3b066bade","Type":"ContainerStarted","Data":"dae46dc146c8e91ccb5344013c269d7efe1ca58a8099ceecb4ad6abfd2a2e583"} Oct 01 12:49:14 crc kubenswrapper[4913]: I1001 12:49:14.756407 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" event={"ID":"7bc350db-085b-4d87-a4a8-cb77c25746f9","Type":"ContainerStarted","Data":"4c9062dd77f305d10918ab7479b560d30e98b2cf52ca71fe259933a450091e1c"} Oct 01 12:49:14 crc kubenswrapper[4913]: I1001 12:49:14.757000 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:14 crc kubenswrapper[4913]: I1001 12:49:14.758741 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" event={"ID":"3028f0ec-9b00-468d-919b-5ed3b066bade","Type":"ContainerStarted","Data":"ad84a995ac4654a5e5246c52a8a8cd4b75efd2d945c655b4c541e76ada2c5a4a"} Oct 01 12:49:14 crc kubenswrapper[4913]: I1001 12:49:14.759166 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:14 crc kubenswrapper[4913]: I1001 12:49:14.781354 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" podStartSLOduration=2.589038037 podStartE2EDuration="14.781338598s" podCreationTimestamp="2025-10-01 12:49:00 +0000 UTC" firstStartedPulling="2025-10-01 12:49:01.451787509 +0000 UTC m=+673.355263087" lastFinishedPulling="2025-10-01 12:49:13.64408807 +0000 UTC m=+685.547563648" observedRunningTime="2025-10-01 12:49:14.777524412 +0000 UTC m=+686.681000000" watchObservedRunningTime="2025-10-01 12:49:14.781338598 +0000 UTC m=+686.684814176" Oct 01 12:49:14 crc kubenswrapper[4913]: I1001 12:49:14.803694 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" podStartSLOduration=2.202207513 podStartE2EDuration="14.803674399s" podCreationTimestamp="2025-10-01 12:49:00 +0000 UTC" firstStartedPulling="2025-10-01 12:49:01.028629094 +0000 UTC m=+672.932104672" lastFinishedPulling="2025-10-01 12:49:13.63009598 +0000 UTC m=+685.533571558" observedRunningTime="2025-10-01 12:49:14.799170714 +0000 UTC m=+686.702646302" watchObservedRunningTime="2025-10-01 12:49:14.803674399 +0000 UTC m=+686.707149977" Oct 01 12:49:31 crc kubenswrapper[4913]: I1001 12:49:31.161194 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-64d4d5b6f-d7bsw" Oct 01 12:49:40 crc kubenswrapper[4913]: I1001 12:49:40.084011 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:49:40 crc kubenswrapper[4913]: I1001 12:49:40.084625 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:49:50 crc kubenswrapper[4913]: I1001 12:49:50.750325 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74c9fc44b9-z2f2n" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.617024 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ksptf"] Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.619949 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.622517 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.622546 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6mpsx" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.630914 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.644302 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz"] Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.645344 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.647621 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.667421 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz"] Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.717841 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74154ce8-d469-4b0a-98f2-23206e3939a4-metrics-certs\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.717887 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-conf\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.717919 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-metrics\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.717954 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-reloader\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.717971 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5sd5\" (UniqueName: \"kubernetes.io/projected/74154ce8-d469-4b0a-98f2-23206e3939a4-kube-api-access-f5sd5\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.718101 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-sockets\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " 
pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.718201 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-startup\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.718985 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nrghj"] Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.720025 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.721462 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.721641 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.721968 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9tptt" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.722042 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.731197 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-pzf47"] Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.732451 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.734131 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.742982 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-pzf47"] Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819313 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwd4m\" (UniqueName: \"kubernetes.io/projected/cf61dc53-493d-4c23-b13a-a4a496d1014d-kube-api-access-dwd4m\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819360 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-reloader\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819376 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5sd5\" (UniqueName: \"kubernetes.io/projected/74154ce8-d469-4b0a-98f2-23206e3939a4-kube-api-access-f5sd5\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819408 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-sockets\") pod 
\"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819436 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-startup\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819459 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metallb-excludel2\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819481 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74154ce8-d469-4b0a-98f2-23206e3939a4-metrics-certs\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819497 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrmr\" (UniqueName: \"kubernetes.io/projected/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-kube-api-access-nlrmr\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819516 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819532 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metrics-certs\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819547 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-conf\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819569 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf61dc53-493d-4c23-b13a-a4a496d1014d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819586 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-metrics\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.819986 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-metrics\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.820170 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-reloader\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.820593 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-sockets\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.820922 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-conf\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.821480 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/74154ce8-d469-4b0a-98f2-23206e3939a4-frr-startup\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.825834 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74154ce8-d469-4b0a-98f2-23206e3939a4-metrics-certs\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.837335 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5sd5\" (UniqueName: \"kubernetes.io/projected/74154ce8-d469-4b0a-98f2-23206e3939a4-kube-api-access-f5sd5\") pod \"frr-k8s-ksptf\" (UID: \"74154ce8-d469-4b0a-98f2-23206e3939a4\") " pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.920947 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf61dc53-493d-4c23-b13a-a4a496d1014d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921006 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwd4m\" (UniqueName: \"kubernetes.io/projected/cf61dc53-493d-4c23-b13a-a4a496d1014d-kube-api-access-dwd4m\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921043 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k2sp\" (UniqueName: \"kubernetes.io/projected/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-kube-api-access-5k2sp\") pod \"controller-5d688f5ffc-pzf47\" (UID: 
\"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921075 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-metrics-certs\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921094 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metallb-excludel2\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921117 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrmr\" (UniqueName: \"kubernetes.io/projected/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-kube-api-access-nlrmr\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921133 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921149 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metrics-certs\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921166 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-cert\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:51 crc kubenswrapper[4913]: E1001 12:49:51.921196 4913 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 01 12:49:51 crc kubenswrapper[4913]: E1001 12:49:51.921304 4913 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 12:49:51 crc kubenswrapper[4913]: E1001 12:49:51.921323 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf61dc53-493d-4c23-b13a-a4a496d1014d-cert podName:cf61dc53-493d-4c23-b13a-a4a496d1014d nodeName:}" failed. No retries permitted until 2025-10-01 12:49:52.421295017 +0000 UTC m=+724.324770675 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf61dc53-493d-4c23-b13a-a4a496d1014d-cert") pod "frr-k8s-webhook-server-5478bdb765-kjqpz" (UID: "cf61dc53-493d-4c23-b13a-a4a496d1014d") : secret "frr-k8s-webhook-server-cert" not found Oct 01 12:49:51 crc kubenswrapper[4913]: E1001 12:49:51.921307 4913 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 01 12:49:51 crc kubenswrapper[4913]: E1001 12:49:51.921374 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist podName:cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7 nodeName:}" failed. No retries permitted until 2025-10-01 12:49:52.421350628 +0000 UTC m=+724.324826226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist") pod "speaker-nrghj" (UID: "cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7") : secret "metallb-memberlist" not found Oct 01 12:49:51 crc kubenswrapper[4913]: E1001 12:49:51.921432 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metrics-certs podName:cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7 nodeName:}" failed. No retries permitted until 2025-10-01 12:49:52.42141606 +0000 UTC m=+724.324891638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metrics-certs") pod "speaker-nrghj" (UID: "cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7") : secret "speaker-certs-secret" not found Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.921893 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metallb-excludel2\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.937214 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ksptf" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.951728 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwd4m\" (UniqueName: \"kubernetes.io/projected/cf61dc53-493d-4c23-b13a-a4a496d1014d-kube-api-access-dwd4m\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:51 crc kubenswrapper[4913]: I1001 12:49:51.953044 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlrmr\" (UniqueName: \"kubernetes.io/projected/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-kube-api-access-nlrmr\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.022622 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k2sp\" (UniqueName: \"kubernetes.io/projected/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-kube-api-access-5k2sp\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.022680 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-metrics-certs\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.022727 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-cert\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.025168 4913 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.032934 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-metrics-certs\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.037841 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-cert\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.051140 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k2sp\" (UniqueName: \"kubernetes.io/projected/4f13db7d-cdb3-47bf-84db-d78e4f620eb9-kube-api-access-5k2sp\") pod \"controller-5d688f5ffc-pzf47\" (UID: \"4f13db7d-cdb3-47bf-84db-d78e4f620eb9\") " pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.344007 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.427852 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.428199 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metrics-certs\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.428255 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf61dc53-493d-4c23-b13a-a4a496d1014d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:52 crc kubenswrapper[4913]: E1001 12:49:52.428043 4913 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 12:49:52 crc kubenswrapper[4913]: E1001 12:49:52.429140 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist podName:cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7 nodeName:}" failed. No retries permitted until 2025-10-01 12:49:53.429115627 +0000 UTC m=+725.332591275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist") pod "speaker-nrghj" (UID: "cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7") : secret "metallb-memberlist" not found Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.432773 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-metrics-certs\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.432996 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf61dc53-493d-4c23-b13a-a4a496d1014d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-kjqpz\" (UID: \"cf61dc53-493d-4c23-b13a-a4a496d1014d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.545336 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-pzf47"] Oct 01 12:49:52 crc kubenswrapper[4913]: W1001 12:49:52.546738 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f13db7d_cdb3_47bf_84db_d78e4f620eb9.slice/crio-e146ab597ec098d22ebe0cd5440465431b37db87cbc8e267d3dc7f69a05c2319 WatchSource:0}: Error finding container e146ab597ec098d22ebe0cd5440465431b37db87cbc8e267d3dc7f69a05c2319: Status 404 returned error can't find the container with id e146ab597ec098d22ebe0cd5440465431b37db87cbc8e267d3dc7f69a05c2319 Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.560444 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.841852 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz"] Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.953761 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pzf47" event={"ID":"4f13db7d-cdb3-47bf-84db-d78e4f620eb9","Type":"ContainerStarted","Data":"4664bf9bec8aa8233d11cde687135baa7f252a3bd7bb51da2b62e0ee83b75470"} Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.953801 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pzf47" event={"ID":"4f13db7d-cdb3-47bf-84db-d78e4f620eb9","Type":"ContainerStarted","Data":"c80c5ed0bb1dd949d0227f18fcfb676ccbc90d75786737030870ec06e20d904f"} Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.953815 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pzf47" event={"ID":"4f13db7d-cdb3-47bf-84db-d78e4f620eb9","Type":"ContainerStarted","Data":"e146ab597ec098d22ebe0cd5440465431b37db87cbc8e267d3dc7f69a05c2319"} Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.953895 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.955028 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" event={"ID":"cf61dc53-493d-4c23-b13a-a4a496d1014d","Type":"ContainerStarted","Data":"04d13c3e047e1796a61cd2f8a7b61c444de835841b1517850804789d3cbdb4aa"} Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.956440 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"845862cc88c8b7f36e5941a089ffa14fbafa0996ec349d2e7ee8637d16835f3b"} Oct 01 12:49:52 crc kubenswrapper[4913]: I1001 12:49:52.970950 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-pzf47" podStartSLOduration=1.97092411 podStartE2EDuration="1.97092411s" podCreationTimestamp="2025-10-01 12:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:49:52.967521975 +0000 UTC m=+724.870997563" watchObservedRunningTime="2025-10-01 12:49:52.97092411 +0000 UTC m=+724.874399718" Oct 01 12:49:53 crc kubenswrapper[4913]: I1001 12:49:53.446476 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:53 crc kubenswrapper[4913]: I1001 12:49:53.455892 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7-memberlist\") pod \"speaker-nrghj\" (UID: \"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7\") " pod="metallb-system/speaker-nrghj" Oct 01 12:49:53 crc kubenswrapper[4913]: I1001 12:49:53.531784 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nrghj" Oct 01 12:49:53 crc kubenswrapper[4913]: W1001 12:49:53.560489 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9b5c10_d3bd_4f45_abc2_d0462bbf93f7.slice/crio-2bd7f34e88a24739127b88a95585d9748f19e1ee0b8a09fe81816bc79b988e40 WatchSource:0}: Error finding container 2bd7f34e88a24739127b88a95585d9748f19e1ee0b8a09fe81816bc79b988e40: Status 404 returned error can't find the container with id 2bd7f34e88a24739127b88a95585d9748f19e1ee0b8a09fe81816bc79b988e40 Oct 01 12:49:53 crc kubenswrapper[4913]: I1001 12:49:53.973882 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nrghj" event={"ID":"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7","Type":"ContainerStarted","Data":"6407b72f91ec053d1450ca53d9db35cd255fcdd4572abbf272e4cf22a7ba1ff9"} Oct 01 12:49:53 crc kubenswrapper[4913]: I1001 12:49:53.973920 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nrghj" event={"ID":"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7","Type":"ContainerStarted","Data":"2bd7f34e88a24739127b88a95585d9748f19e1ee0b8a09fe81816bc79b988e40"} Oct 01 12:49:54 crc kubenswrapper[4913]: I1001 12:49:54.987258 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nrghj" event={"ID":"cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7","Type":"ContainerStarted","Data":"2a7768fbcb017a682687f157be7e457c276403883a6ebe72b6c3f328e3125b7c"} Oct 01 12:49:54 crc kubenswrapper[4913]: I1001 12:49:54.987433 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nrghj" Oct 01 12:49:55 crc kubenswrapper[4913]: I1001 12:49:55.007940 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-nrghj" podStartSLOduration=4.007868521 podStartE2EDuration="4.007868521s" podCreationTimestamp="2025-10-01 12:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:49:55.004542669 +0000 UTC m=+726.908018277" watchObservedRunningTime="2025-10-01 12:49:55.007868521 +0000 UTC m=+726.911344099" Oct 01 12:50:00 crc kubenswrapper[4913]: I1001 12:50:00.011953 4913 generic.go:334] "Generic (PLEG): container finished" podID="74154ce8-d469-4b0a-98f2-23206e3939a4" containerID="6ac4a4445b75f522345ff705778fdb3ff9b7c6c94edadfd41c0032d4ab0d6ab4" exitCode=0 Oct 01 12:50:00 crc kubenswrapper[4913]: I1001 12:50:00.012045 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerDied","Data":"6ac4a4445b75f522345ff705778fdb3ff9b7c6c94edadfd41c0032d4ab0d6ab4"} Oct 01 12:50:00 crc kubenswrapper[4913]: I1001 12:50:00.013785 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" event={"ID":"cf61dc53-493d-4c23-b13a-a4a496d1014d","Type":"ContainerStarted","Data":"b9473d5b1321cdd19deec7bfb689206ec40f553ec282caf824db3cd0960bb4c5"} Oct 01 12:50:00 crc kubenswrapper[4913]: I1001 12:50:00.013941 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:50:00 crc kubenswrapper[4913]: I1001 12:50:00.046301 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" podStartSLOduration=2.532565209 
podStartE2EDuration="9.046280558s" podCreationTimestamp="2025-10-01 12:49:51 +0000 UTC" firstStartedPulling="2025-10-01 12:49:52.847202393 +0000 UTC m=+724.750677991" lastFinishedPulling="2025-10-01 12:49:59.360917752 +0000 UTC m=+731.264393340" observedRunningTime="2025-10-01 12:50:00.044111069 +0000 UTC m=+731.947586657" watchObservedRunningTime="2025-10-01 12:50:00.046280558 +0000 UTC m=+731.949756136" Oct 01 12:50:01 crc kubenswrapper[4913]: I1001 12:50:01.026659 4913 generic.go:334] "Generic (PLEG): container finished" podID="74154ce8-d469-4b0a-98f2-23206e3939a4" containerID="d0728299944595c0079b244e4536554ed57a703bb790afef98838ca2c4d4e150" exitCode=0 Oct 01 12:50:01 crc kubenswrapper[4913]: I1001 12:50:01.027517 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerDied","Data":"d0728299944595c0079b244e4536554ed57a703bb790afef98838ca2c4d4e150"} Oct 01 12:50:02 crc kubenswrapper[4913]: I1001 12:50:02.037872 4913 generic.go:334] "Generic (PLEG): container finished" podID="74154ce8-d469-4b0a-98f2-23206e3939a4" containerID="7e87301abac2dec212064c101e935ea02542281b560d3ef142d2951d4153852d" exitCode=0 Oct 01 12:50:02 crc kubenswrapper[4913]: I1001 12:50:02.037919 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerDied","Data":"7e87301abac2dec212064c101e935ea02542281b560d3ef142d2951d4153852d"} Oct 01 12:50:02 crc kubenswrapper[4913]: I1001 12:50:02.347458 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-pzf47" Oct 01 12:50:03 crc kubenswrapper[4913]: I1001 12:50:03.049074 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"425937459e35057563ff68eb10a63a4c7210537e09c22da3e7318cc43e2df527"} Oct 01 12:50:03 crc kubenswrapper[4913]: I1001 12:50:03.049114 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"1eb7096bf293e4243b520a1c9feb2fa278db666d07a58b6de4a250872ef4c639"} Oct 01 12:50:03 crc kubenswrapper[4913]: I1001 12:50:03.049124 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"a5cef1c10cb4efe91082b43df89d77dc635ff1fd4ede772d0313db568221cd5f"} Oct 01 12:50:03 crc kubenswrapper[4913]: I1001 12:50:03.049135 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"bdd1a0c1de761e4b87bb72122a72ad3a9b677126b9d48dad14683eddec130f33"} Oct 01 12:50:03 crc kubenswrapper[4913]: I1001 12:50:03.049143 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"862465f9985db17220e264a84925c4139488f2f50028d768f8348a4b494b8e19"} Oct 01 12:50:03 crc kubenswrapper[4913]: I1001 12:50:03.536762 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-nrghj" Oct 01 12:50:04 crc kubenswrapper[4913]: I1001 12:50:04.059741 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-ksptf" event={"ID":"74154ce8-d469-4b0a-98f2-23206e3939a4","Type":"ContainerStarted","Data":"e6bdca85e007c19a175a597eaf67b0981095469f4d95c338ad7343f907ab94bb"} Oct 01 12:50:04 crc kubenswrapper[4913]: I1001 12:50:04.060554 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ksptf" Oct 01 12:50:04 crc kubenswrapper[4913]: I1001 12:50:04.084592 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ksptf" podStartSLOduration=5.8827733250000005 podStartE2EDuration="13.084570015s" podCreationTimestamp="2025-10-01 12:49:51 +0000 UTC" firstStartedPulling="2025-10-01 12:49:52.144673652 +0000 UTC m=+724.048149240" lastFinishedPulling="2025-10-01 12:49:59.346470342 +0000 UTC m=+731.249945930" observedRunningTime="2025-10-01 12:50:04.081450549 +0000 UTC m=+735.984926177" watchObservedRunningTime="2025-10-01 12:50:04.084570015 +0000 UTC m=+735.988045613" Oct 01 12:50:06 crc kubenswrapper[4913]: I1001 12:50:06.938418 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ksptf" Oct 01 12:50:06 crc kubenswrapper[4913]: I1001 12:50:06.995075 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ksptf" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.083553 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.083907 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.292371 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p6qxl"] Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.294193 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.296788 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.296788 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-sfkkd" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.300188 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.300370 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p6qxl"] Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.311917 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hbq2\" (UniqueName: \"kubernetes.io/projected/b7d963bd-d50c-4a8b-bd2d-75220df4c964-kube-api-access-9hbq2\") pod \"openstack-operator-index-p6qxl\" (UID: \"b7d963bd-d50c-4a8b-bd2d-75220df4c964\") " pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.413534 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hbq2\" (UniqueName: \"kubernetes.io/projected/b7d963bd-d50c-4a8b-bd2d-75220df4c964-kube-api-access-9hbq2\") pod \"openstack-operator-index-p6qxl\" (UID: \"b7d963bd-d50c-4a8b-bd2d-75220df4c964\") " pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.440867 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hbq2\" (UniqueName: \"kubernetes.io/projected/b7d963bd-d50c-4a8b-bd2d-75220df4c964-kube-api-access-9hbq2\") pod \"openstack-operator-index-p6qxl\" (UID: \"b7d963bd-d50c-4a8b-bd2d-75220df4c964\") " pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:10 crc kubenswrapper[4913]: I1001 12:50:10.622416 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:11 crc kubenswrapper[4913]: I1001 12:50:11.053813 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p6qxl"] Oct 01 12:50:11 crc kubenswrapper[4913]: I1001 12:50:11.109520 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p6qxl" event={"ID":"b7d963bd-d50c-4a8b-bd2d-75220df4c964","Type":"ContainerStarted","Data":"21f9fcfac1a3eefde6f8de66dea6527e23f46186b5c982dc0ec3a81d9c3a43d8"} Oct 01 12:50:12 crc kubenswrapper[4913]: I1001 12:50:12.566676 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-kjqpz" Oct 01 12:50:13 crc kubenswrapper[4913]: I1001 12:50:13.124220 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p6qxl" event={"ID":"b7d963bd-d50c-4a8b-bd2d-75220df4c964","Type":"ContainerStarted","Data":"63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1"} Oct 01 12:50:13 crc kubenswrapper[4913]: I1001 12:50:13.146588 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p6qxl" podStartSLOduration=2.089985966 podStartE2EDuration="3.146568163s" podCreationTimestamp="2025-10-01 12:50:10 +0000 UTC" firstStartedPulling="2025-10-01 12:50:11.061835888 +0000 UTC m=+742.965311466" lastFinishedPulling="2025-10-01 12:50:12.118418055 +0000 UTC m=+744.021893663" observedRunningTime="2025-10-01 12:50:13.141018119 +0000 UTC m=+745.044493747" watchObservedRunningTime="2025-10-01 12:50:13.146568163 +0000 UTC m=+745.050043771" Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.485113 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-p6qxl"] Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.487142 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-p6qxl" podUID="b7d963bd-d50c-4a8b-bd2d-75220df4c964" containerName="registry-server" containerID="cri-o://63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1" gracePeriod=2 Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.721987 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vwnnv"] Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.722338 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" podUID="bb7d7281-96c5-4a7d-b57f-769fbafba858" containerName="controller-manager" containerID="cri-o://17d8224fd5e1c214f91d489f4b4d53e8755766f76af76a8b69c070345de98785" gracePeriod=30 Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.810516 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"] Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.811040 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" podUID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" containerName="route-controller-manager" containerID="cri-o://6de81adf1d32a551c07256d9bb6afc1dfe17c2edc6aea8303f53f43fbc2786a6" gracePeriod=30 Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.969373 4913 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.985409 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hbq2\" (UniqueName: \"kubernetes.io/projected/b7d963bd-d50c-4a8b-bd2d-75220df4c964-kube-api-access-9hbq2\") pod \"b7d963bd-d50c-4a8b-bd2d-75220df4c964\" (UID: \"b7d963bd-d50c-4a8b-bd2d-75220df4c964\") " Oct 01 12:50:15 crc kubenswrapper[4913]: I1001 12:50:15.992674 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d963bd-d50c-4a8b-bd2d-75220df4c964-kube-api-access-9hbq2" (OuterVolumeSpecName: "kube-api-access-9hbq2") pod "b7d963bd-d50c-4a8b-bd2d-75220df4c964" (UID: "b7d963bd-d50c-4a8b-bd2d-75220df4c964"). InnerVolumeSpecName "kube-api-access-9hbq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.089013 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hbq2\" (UniqueName: \"kubernetes.io/projected/b7d963bd-d50c-4a8b-bd2d-75220df4c964-kube-api-access-9hbq2\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.091419 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xptgj"] Oct 01 12:50:16 crc kubenswrapper[4913]: E1001 12:50:16.091633 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d963bd-d50c-4a8b-bd2d-75220df4c964" containerName="registry-server" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.091645 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d963bd-d50c-4a8b-bd2d-75220df4c964" containerName="registry-server" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.091760 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d963bd-d50c-4a8b-bd2d-75220df4c964" containerName="registry-server" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.092116 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.104137 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xptgj"] Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.140953 4913 generic.go:334] "Generic (PLEG): container finished" podID="bb7d7281-96c5-4a7d-b57f-769fbafba858" containerID="17d8224fd5e1c214f91d489f4b4d53e8755766f76af76a8b69c070345de98785" exitCode=0 Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.141047 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" event={"ID":"bb7d7281-96c5-4a7d-b57f-769fbafba858","Type":"ContainerDied","Data":"17d8224fd5e1c214f91d489f4b4d53e8755766f76af76a8b69c070345de98785"} Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.142909 4913 generic.go:334] "Generic (PLEG): container finished" podID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" containerID="6de81adf1d32a551c07256d9bb6afc1dfe17c2edc6aea8303f53f43fbc2786a6" exitCode=0 Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.143005 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" event={"ID":"db3cf64b-1ff7-4792-94fa-4400e85bc1d6","Type":"ContainerDied","Data":"6de81adf1d32a551c07256d9bb6afc1dfe17c2edc6aea8303f53f43fbc2786a6"} Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.143842 4913 generic.go:334] "Generic (PLEG): container finished" podID="b7d963bd-d50c-4a8b-bd2d-75220df4c964" containerID="63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1" exitCode=0 Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.143866 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p6qxl" event={"ID":"b7d963bd-d50c-4a8b-bd2d-75220df4c964","Type":"ContainerDied","Data":"63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1"} Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.143881 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p6qxl" event={"ID":"b7d963bd-d50c-4a8b-bd2d-75220df4c964","Type":"ContainerDied","Data":"21f9fcfac1a3eefde6f8de66dea6527e23f46186b5c982dc0ec3a81d9c3a43d8"} Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.143897 4913 scope.go:117] "RemoveContainer" containerID="63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.143988 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p6qxl" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.156183 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.163814 4913 scope.go:117] "RemoveContainer" containerID="63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1" Oct 01 12:50:16 crc kubenswrapper[4913]: E1001 12:50:16.164354 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1\": container with ID starting with 63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1 not found: ID does not exist" containerID="63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.164392 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1"} err="failed to get container status \"63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1\": rpc error: code = NotFound desc = could not find container \"63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1\": container with ID starting with 63753bcbb2122ea1a666a48627fc3b3f60d800e9ba3e7e6e7ea03a0c004540e1 not found: ID does not exist" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.165951 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223521 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvzqn\" (UniqueName: \"kubernetes.io/projected/bb7d7281-96c5-4a7d-b57f-769fbafba858-kube-api-access-jvzqn\") pod \"bb7d7281-96c5-4a7d-b57f-769fbafba858\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223563 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7d7281-96c5-4a7d-b57f-769fbafba858-serving-cert\") pod \"bb7d7281-96c5-4a7d-b57f-769fbafba858\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223616 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-config\") pod \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223640 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-proxy-ca-bundles\") pod \"bb7d7281-96c5-4a7d-b57f-769fbafba858\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223672 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-config\") pod \"bb7d7281-96c5-4a7d-b57f-769fbafba858\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223689 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-client-ca\") pod \"bb7d7281-96c5-4a7d-b57f-769fbafba858\" (UID: \"bb7d7281-96c5-4a7d-b57f-769fbafba858\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223710 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-serving-cert\") pod \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223726 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-client-ca\") pod \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223748 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrcmg\" (UniqueName: \"kubernetes.io/projected/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-kube-api-access-xrcmg\") pod \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\" (UID: \"db3cf64b-1ff7-4792-94fa-4400e85bc1d6\") " Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.223871 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg6wj\" (UniqueName: \"kubernetes.io/projected/5262687f-5e25-4632-8122-9e15fb72e8d9-kube-api-access-gg6wj\") pod \"openstack-operator-index-xptgj\" (UID: \"5262687f-5e25-4632-8122-9e15fb72e8d9\") " pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.224637 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb7d7281-96c5-4a7d-b57f-769fbafba858" (UID: "bb7d7281-96c5-4a7d-b57f-769fbafba858"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.225522 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bb7d7281-96c5-4a7d-b57f-769fbafba858" (UID: "bb7d7281-96c5-4a7d-b57f-769fbafba858"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.226119 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "db3cf64b-1ff7-4792-94fa-4400e85bc1d6" (UID: "db3cf64b-1ff7-4792-94fa-4400e85bc1d6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.227920 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-config" (OuterVolumeSpecName: "config") pod "db3cf64b-1ff7-4792-94fa-4400e85bc1d6" (UID: "db3cf64b-1ff7-4792-94fa-4400e85bc1d6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.230492 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-config" (OuterVolumeSpecName: "config") pod "bb7d7281-96c5-4a7d-b57f-769fbafba858" (UID: "bb7d7281-96c5-4a7d-b57f-769fbafba858"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.245821 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "db3cf64b-1ff7-4792-94fa-4400e85bc1d6" (UID: "db3cf64b-1ff7-4792-94fa-4400e85bc1d6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.248935 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7d7281-96c5-4a7d-b57f-769fbafba858-kube-api-access-jvzqn" (OuterVolumeSpecName: "kube-api-access-jvzqn") pod "bb7d7281-96c5-4a7d-b57f-769fbafba858" (UID: "bb7d7281-96c5-4a7d-b57f-769fbafba858"). InnerVolumeSpecName "kube-api-access-jvzqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.250344 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-kube-api-access-xrcmg" (OuterVolumeSpecName: "kube-api-access-xrcmg") pod "db3cf64b-1ff7-4792-94fa-4400e85bc1d6" (UID: "db3cf64b-1ff7-4792-94fa-4400e85bc1d6"). InnerVolumeSpecName "kube-api-access-xrcmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.253216 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7d7281-96c5-4a7d-b57f-769fbafba858-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb7d7281-96c5-4a7d-b57f-769fbafba858" (UID: "bb7d7281-96c5-4a7d-b57f-769fbafba858"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.267015 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-p6qxl"] Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.271627 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-p6qxl"] Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324592 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg6wj\" (UniqueName: \"kubernetes.io/projected/5262687f-5e25-4632-8122-9e15fb72e8d9-kube-api-access-gg6wj\") pod \"openstack-operator-index-xptgj\" (UID: \"5262687f-5e25-4632-8122-9e15fb72e8d9\") " pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324668 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324686 4913 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324776 4913 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324789 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7d7281-96c5-4a7d-b57f-769fbafba858-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324800 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324810 4913 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324822 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrcmg\" (UniqueName: \"kubernetes.io/projected/db3cf64b-1ff7-4792-94fa-4400e85bc1d6-kube-api-access-xrcmg\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324833 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvzqn\" (UniqueName: \"kubernetes.io/projected/bb7d7281-96c5-4a7d-b57f-769fbafba858-kube-api-access-jvzqn\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.324845 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7d7281-96c5-4a7d-b57f-769fbafba858-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.339387 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg6wj\" (UniqueName: \"kubernetes.io/projected/5262687f-5e25-4632-8122-9e15fb72e8d9-kube-api-access-gg6wj\") pod \"openstack-operator-index-xptgj\" (UID: \"5262687f-5e25-4632-8122-9e15fb72e8d9\") " 
pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.415525 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.600199 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xptgj"] Oct 01 12:50:16 crc kubenswrapper[4913]: W1001 12:50:16.620481 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5262687f_5e25_4632_8122_9e15fb72e8d9.slice/crio-1e6c84bb6b2f731c192a915c436567cce401c0479010cbc1e5ab14e27fc78e59 WatchSource:0}: Error finding container 1e6c84bb6b2f731c192a915c436567cce401c0479010cbc1e5ab14e27fc78e59: Status 404 returned error can't find the container with id 1e6c84bb6b2f731c192a915c436567cce401c0479010cbc1e5ab14e27fc78e59 Oct 01 12:50:16 crc kubenswrapper[4913]: I1001 12:50:16.812653 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d963bd-d50c-4a8b-bd2d-75220df4c964" path="/var/lib/kubelet/pods/b7d963bd-d50c-4a8b-bd2d-75220df4c964/volumes" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.151033 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.151002 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph" event={"ID":"db3cf64b-1ff7-4792-94fa-4400e85bc1d6","Type":"ContainerDied","Data":"3324d0a7aec959692c76495f067854eb2e0273416856bc1b53619d452d5cc3b0"} Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.151224 4913 scope.go:117] "RemoveContainer" containerID="6de81adf1d32a551c07256d9bb6afc1dfe17c2edc6aea8303f53f43fbc2786a6" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.153644 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xptgj" event={"ID":"5262687f-5e25-4632-8122-9e15fb72e8d9","Type":"ContainerStarted","Data":"1e6c84bb6b2f731c192a915c436567cce401c0479010cbc1e5ab14e27fc78e59"} Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.155939 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" event={"ID":"bb7d7281-96c5-4a7d-b57f-769fbafba858","Type":"ContainerDied","Data":"824ea4dcc829ba1d015144f5c2015d0b6ff63b3594fb6ff7a71ce56551204f17"} Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.156106 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vwnnv" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.185000 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.191589 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmgph"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.206737 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vwnnv"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.208476 4913 scope.go:117] "RemoveContainer" containerID="17d8224fd5e1c214f91d489f4b4d53e8755766f76af76a8b69c070345de98785" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.211809 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vwnnv"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.668472 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54c6847657-zw6fx"] Oct 01 12:50:17 crc kubenswrapper[4913]: E1001 12:50:17.668977 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7d7281-96c5-4a7d-b57f-769fbafba858" containerName="controller-manager" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.668989 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7d7281-96c5-4a7d-b57f-769fbafba858" containerName="controller-manager" Oct 01 12:50:17 crc kubenswrapper[4913]: E1001 12:50:17.669003 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" containerName="route-controller-manager" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.669009 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" containerName="route-controller-manager" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.669113 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" containerName="route-controller-manager" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.669121 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7d7281-96c5-4a7d-b57f-769fbafba858" containerName="controller-manager" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.669584 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.672402 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.672470 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.673232 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.673247 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.673782 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.674474 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.674766 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.675644 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.681435 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.682023 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.682492 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.682600 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.682653 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.682839 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.686856 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.689376 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c6847657-zw6fx"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.693087 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"] Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.695986 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"] Oct 
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859689 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-client-ca\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859736 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-serving-cert\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859759 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-config\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859784 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-config\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859812 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-client-ca\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859840 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xg8z\" (UniqueName: \"kubernetes.io/projected/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-kube-api-access-2xg8z\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859862 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4zz\" (UniqueName: \"kubernetes.io/projected/8b219eec-a0b8-4f69-99d6-f918ff030219-kube-api-access-4h4zz\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859879 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-proxy-ca-bundles\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.859910 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b219eec-a0b8-4f69-99d6-f918ff030219-serving-cert\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960644 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b219eec-a0b8-4f69-99d6-f918ff030219-serving-cert\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960724 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-client-ca\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960747 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-serving-cert\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960775 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-config\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960801 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-config\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960825 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-client-ca\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960853 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xg8z\" (UniqueName: \"kubernetes.io/projected/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-kube-api-access-2xg8z\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx"
volume \"kube-api-access-2xg8z\" (UniqueName: \"kubernetes.io/projected/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-kube-api-access-2xg8z\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960878 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4zz\" (UniqueName: \"kubernetes.io/projected/8b219eec-a0b8-4f69-99d6-f918ff030219-kube-api-access-4h4zz\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.960894 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-proxy-ca-bundles\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.961815 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-proxy-ca-bundles\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.962377 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-client-ca\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.962614 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-config\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.963234 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-config\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.963602 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-client-ca\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.967125 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-serving-cert\") pod \"controller-manager-54c6847657-zw6fx\" (UID: 
\"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.969520 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b219eec-a0b8-4f69-99d6-f918ff030219-serving-cert\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.982902 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4zz\" (UniqueName: \"kubernetes.io/projected/8b219eec-a0b8-4f69-99d6-f918ff030219-kube-api-access-4h4zz\") pod \"route-controller-manager-75fd69bb5d-2fqsb\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:17 crc kubenswrapper[4913]: I1001 12:50:17.990134 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xg8z\" (UniqueName: \"kubernetes.io/projected/c00eccb7-4d2b-495f-a9b4-c46d02977f6f-kube-api-access-2xg8z\") pod \"controller-manager-54c6847657-zw6fx\" (UID: \"c00eccb7-4d2b-495f-a9b4-c46d02977f6f\") " pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.010116 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.166982 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.167833 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xptgj" event={"ID":"5262687f-5e25-4632-8122-9e15fb72e8d9","Type":"ContainerStarted","Data":"7b6552fbae20eb75b15a365d368b3c97d65af733ba95e0bb8120b75c25ac2acd"} Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.189411 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.369815 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b219eec-a0b8-4f69-99d6-f918ff030219-serving-cert\") pod \"8b219eec-a0b8-4f69-99d6-f918ff030219\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.369886 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h4zz\" (UniqueName: \"kubernetes.io/projected/8b219eec-a0b8-4f69-99d6-f918ff030219-kube-api-access-4h4zz\") pod \"8b219eec-a0b8-4f69-99d6-f918ff030219\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.369926 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-client-ca\") pod \"8b219eec-a0b8-4f69-99d6-f918ff030219\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.370047 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-config\") pod \"8b219eec-a0b8-4f69-99d6-f918ff030219\" (UID: \"8b219eec-a0b8-4f69-99d6-f918ff030219\") " Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.371023 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-config" (OuterVolumeSpecName: "config") pod "8b219eec-a0b8-4f69-99d6-f918ff030219" (UID: "8b219eec-a0b8-4f69-99d6-f918ff030219"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.371213 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-client-ca" (OuterVolumeSpecName: "client-ca") pod "8b219eec-a0b8-4f69-99d6-f918ff030219" (UID: "8b219eec-a0b8-4f69-99d6-f918ff030219"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.374638 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b219eec-a0b8-4f69-99d6-f918ff030219-kube-api-access-4h4zz" (OuterVolumeSpecName: "kube-api-access-4h4zz") pod "8b219eec-a0b8-4f69-99d6-f918ff030219" (UID: "8b219eec-a0b8-4f69-99d6-f918ff030219"). InnerVolumeSpecName "kube-api-access-4h4zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.383501 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b219eec-a0b8-4f69-99d6-f918ff030219-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8b219eec-a0b8-4f69-99d6-f918ff030219" (UID: "8b219eec-a0b8-4f69-99d6-f918ff030219"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.471949 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.472016 4913 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b219eec-a0b8-4f69-99d6-f918ff030219-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.472032 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h4zz\" (UniqueName: \"kubernetes.io/projected/8b219eec-a0b8-4f69-99d6-f918ff030219-kube-api-access-4h4zz\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.472046 4913 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b219eec-a0b8-4f69-99d6-f918ff030219-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.479259 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xptgj" podStartSLOduration=1.986707802 podStartE2EDuration="2.479243089s" podCreationTimestamp="2025-10-01 12:50:16 +0000 UTC" firstStartedPulling="2025-10-01 12:50:16.628408852 +0000 UTC m=+748.531884430" lastFinishedPulling="2025-10-01 12:50:17.120944139 +0000 UTC m=+749.024419717" observedRunningTime="2025-10-01 12:50:18.188845769 +0000 UTC m=+750.092321357" watchObservedRunningTime="2025-10-01 12:50:18.479243089 +0000 UTC m=+750.382718667" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.482779 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c6847657-zw6fx"] Oct 01 12:50:18 crc kubenswrapper[4913]: W1001 12:50:18.493014 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00eccb7_4d2b_495f_a9b4_c46d02977f6f.slice/crio-38f97c281b3eb71643880ea7c5a821ae9caee10ab30713b3ceb7c2190e0c0012 WatchSource:0}: Error finding container 38f97c281b3eb71643880ea7c5a821ae9caee10ab30713b3ceb7c2190e0c0012: Status 404 returned error can't find the container with id 38f97c281b3eb71643880ea7c5a821ae9caee10ab30713b3ceb7c2190e0c0012 Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.813554 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7d7281-96c5-4a7d-b57f-769fbafba858" path="/var/lib/kubelet/pods/bb7d7281-96c5-4a7d-b57f-769fbafba858/volumes" Oct 01 12:50:18 crc kubenswrapper[4913]: I1001 12:50:18.814198 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3cf64b-1ff7-4792-94fa-4400e85bc1d6" path="/var/lib/kubelet/pods/db3cf64b-1ff7-4792-94fa-4400e85bc1d6/volumes" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.174812 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.174804 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" event={"ID":"c00eccb7-4d2b-495f-a9b4-c46d02977f6f","Type":"ContainerStarted","Data":"1e37b5d1b846c3d2ba13b775fbcfe562e840a27b3b5d9d8afcb2551711652c5f"} Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.174942 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" event={"ID":"c00eccb7-4d2b-495f-a9b4-c46d02977f6f","Type":"ContainerStarted","Data":"38f97c281b3eb71643880ea7c5a821ae9caee10ab30713b3ceb7c2190e0c0012"} Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.208261 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" podStartSLOduration=4.208139981 podStartE2EDuration="4.208139981s" podCreationTimestamp="2025-10-01 12:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:50:19.198167495 +0000 UTC m=+751.101643093" watchObservedRunningTime="2025-10-01 12:50:19.208139981 +0000 UTC m=+751.111615549" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.246620 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn"] Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.247637 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.249948 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.250050 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.254290 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"] Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.254805 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.255060 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.255184 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.255628 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.260340 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd69bb5d-2fqsb"] Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.263500 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn"] 
Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.289064 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kl65\" (UniqueName: \"kubernetes.io/projected/a01d0425-4a64-49ab-a202-e8c254343dd2-kube-api-access-7kl65\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.289176 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01d0425-4a64-49ab-a202-e8c254343dd2-serving-cert\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.289206 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01d0425-4a64-49ab-a202-e8c254343dd2-config\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.289232 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01d0425-4a64-49ab-a202-e8c254343dd2-client-ca\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.389621 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01d0425-4a64-49ab-a202-e8c254343dd2-client-ca\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.390469 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01d0425-4a64-49ab-a202-e8c254343dd2-client-ca\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.390593 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kl65\" (UniqueName: \"kubernetes.io/projected/a01d0425-4a64-49ab-a202-e8c254343dd2-kube-api-access-7kl65\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.390902 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01d0425-4a64-49ab-a202-e8c254343dd2-serving-cert\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " 
pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.390930 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01d0425-4a64-49ab-a202-e8c254343dd2-config\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.393894 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01d0425-4a64-49ab-a202-e8c254343dd2-config\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.402008 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01d0425-4a64-49ab-a202-e8c254343dd2-serving-cert\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.411240 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kl65\" (UniqueName: \"kubernetes.io/projected/a01d0425-4a64-49ab-a202-e8c254343dd2-kube-api-access-7kl65\") pod \"route-controller-manager-645f75cf97-fktbn\" (UID: \"a01d0425-4a64-49ab-a202-e8c254343dd2\") " pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.564924 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:19 crc kubenswrapper[4913]: I1001 12:50:19.983062 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn"] Oct 01 12:50:19 crc kubenswrapper[4913]: W1001 12:50:19.993188 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01d0425_4a64_49ab_a202_e8c254343dd2.slice/crio-13c81a5a2f04e1b817a063c9307a7c2bd312ec8d6953ebf05dc06a8b8168350e WatchSource:0}: Error finding container 13c81a5a2f04e1b817a063c9307a7c2bd312ec8d6953ebf05dc06a8b8168350e: Status 404 returned error can't find the container with id 13c81a5a2f04e1b817a063c9307a7c2bd312ec8d6953ebf05dc06a8b8168350e Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.180736 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" event={"ID":"a01d0425-4a64-49ab-a202-e8c254343dd2","Type":"ContainerStarted","Data":"b3e5f4071df90a6710d85506a9530bb1ea4a7310c78690150c04b7c918d068a6"} Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.180789 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" event={"ID":"a01d0425-4a64-49ab-a202-e8c254343dd2","Type":"ContainerStarted","Data":"13c81a5a2f04e1b817a063c9307a7c2bd312ec8d6953ebf05dc06a8b8168350e"} Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.180915 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.180938 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.182409 4913 patch_prober.go:28] interesting pod/route-controller-manager-645f75cf97-fktbn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.182501 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" podUID="a01d0425-4a64-49ab-a202-e8c254343dd2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.188424 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54c6847657-zw6fx" Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.215730 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" podStartSLOduration=3.21570629 podStartE2EDuration="3.21570629s" podCreationTimestamp="2025-10-01 12:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:50:20.196478257 +0000 UTC m=+752.099953875" watchObservedRunningTime="2025-10-01 12:50:20.21570629 
+0000 UTC m=+752.119181888" Oct 01 12:50:20 crc kubenswrapper[4913]: I1001 12:50:20.814672 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b219eec-a0b8-4f69-99d6-f918ff030219" path="/var/lib/kubelet/pods/8b219eec-a0b8-4f69-99d6-f918ff030219/volumes" Oct 01 12:50:21 crc kubenswrapper[4913]: I1001 12:50:21.197739 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-645f75cf97-fktbn" Oct 01 12:50:21 crc kubenswrapper[4913]: I1001 12:50:21.941420 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ksptf" Oct 01 12:50:23 crc kubenswrapper[4913]: I1001 12:50:23.101139 4913 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 12:50:26 crc kubenswrapper[4913]: I1001 12:50:26.415787 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:26 crc kubenswrapper[4913]: I1001 12:50:26.417103 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:26 crc kubenswrapper[4913]: I1001 12:50:26.459309 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:27 crc kubenswrapper[4913]: I1001 12:50:27.256712 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xptgj" Oct 01 12:50:28 crc kubenswrapper[4913]: I1001 12:50:28.926868 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt"] Oct 01 12:50:28 crc kubenswrapper[4913]: I1001 12:50:28.928250 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:28 crc kubenswrapper[4913]: I1001 12:50:28.930385 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qpjjs" Oct 01 12:50:28 crc kubenswrapper[4913]: I1001 12:50:28.936800 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt"] Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.034919 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-util\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.034987 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-bundle\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.035087 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx86l\" (UniqueName: \"kubernetes.io/projected/848f3124-2a1f-45fa-bb83-893c3db866ae-kube-api-access-sx86l\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.136551 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-util\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.136609 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-bundle\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.136979 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx86l\" (UniqueName: \"kubernetes.io/projected/848f3124-2a1f-45fa-bb83-893c3db866ae-kube-api-access-sx86l\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.137479 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-util\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.137571 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-bundle\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.164820 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx86l\" (UniqueName: \"kubernetes.io/projected/848f3124-2a1f-45fa-bb83-893c3db866ae-kube-api-access-sx86l\") pod \"0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.268421 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:29 crc kubenswrapper[4913]: I1001 12:50:29.692523 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt"] Oct 01 12:50:29 crc kubenswrapper[4913]: W1001 12:50:29.706926 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848f3124_2a1f_45fa_bb83_893c3db866ae.slice/crio-2b98f8e3576dd87882644f27ada34548ed330aed0d8a01ecfd00f40d0f8a1478 WatchSource:0}: Error finding container 2b98f8e3576dd87882644f27ada34548ed330aed0d8a01ecfd00f40d0f8a1478: Status 404 returned error can't find the container with id 2b98f8e3576dd87882644f27ada34548ed330aed0d8a01ecfd00f40d0f8a1478 Oct 01 12:50:30 crc kubenswrapper[4913]: I1001 12:50:30.252657 4913 generic.go:334] "Generic (PLEG): container finished" podID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerID="48ccd3666d50cca4d19bb9f7fe0d3c2021e6a8fc876853f305d2f35167e372f2" exitCode=0 Oct 01 12:50:30 crc kubenswrapper[4913]: I1001 12:50:30.252784 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" event={"ID":"848f3124-2a1f-45fa-bb83-893c3db866ae","Type":"ContainerDied","Data":"48ccd3666d50cca4d19bb9f7fe0d3c2021e6a8fc876853f305d2f35167e372f2"} Oct 01 12:50:30 crc kubenswrapper[4913]: I1001 12:50:30.253251 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" event={"ID":"848f3124-2a1f-45fa-bb83-893c3db866ae","Type":"ContainerStarted","Data":"2b98f8e3576dd87882644f27ada34548ed330aed0d8a01ecfd00f40d0f8a1478"} Oct 01 12:50:32 crc kubenswrapper[4913]: I1001 12:50:32.267984 4913 generic.go:334] "Generic (PLEG): container finished" podID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerID="a7ee1f1f9544b119a26c0db80fcd340a328c0c0440e448542622d5bf2940872d" exitCode=0 Oct 01 12:50:32 crc kubenswrapper[4913]: I1001 12:50:32.268137 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" event={"ID":"848f3124-2a1f-45fa-bb83-893c3db866ae","Type":"ContainerDied","Data":"a7ee1f1f9544b119a26c0db80fcd340a328c0c0440e448542622d5bf2940872d"} Oct 01 12:50:33 crc kubenswrapper[4913]: I1001 12:50:33.280044 4913 generic.go:334] "Generic (PLEG): container finished" podID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerID="ba01c36eab4fbe2203d6e765275eb4243c7cccd4e638962b4fc4139c3a6bc938" exitCode=0 Oct 01 12:50:33 crc kubenswrapper[4913]: I1001 12:50:33.280501 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" event={"ID":"848f3124-2a1f-45fa-bb83-893c3db866ae","Type":"ContainerDied","Data":"ba01c36eab4fbe2203d6e765275eb4243c7cccd4e638962b4fc4139c3a6bc938"} Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.702372 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.819475 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx86l\" (UniqueName: \"kubernetes.io/projected/848f3124-2a1f-45fa-bb83-893c3db866ae-kube-api-access-sx86l\") pod \"848f3124-2a1f-45fa-bb83-893c3db866ae\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.819698 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-bundle\") pod \"848f3124-2a1f-45fa-bb83-893c3db866ae\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.819740 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-util\") pod \"848f3124-2a1f-45fa-bb83-893c3db866ae\" (UID: \"848f3124-2a1f-45fa-bb83-893c3db866ae\") " Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.820552 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-bundle" (OuterVolumeSpecName: "bundle") pod "848f3124-2a1f-45fa-bb83-893c3db866ae" (UID: "848f3124-2a1f-45fa-bb83-893c3db866ae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.827534 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848f3124-2a1f-45fa-bb83-893c3db866ae-kube-api-access-sx86l" (OuterVolumeSpecName: "kube-api-access-sx86l") pod "848f3124-2a1f-45fa-bb83-893c3db866ae" (UID: "848f3124-2a1f-45fa-bb83-893c3db866ae"). InnerVolumeSpecName "kube-api-access-sx86l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.840556 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-util" (OuterVolumeSpecName: "util") pod "848f3124-2a1f-45fa-bb83-893c3db866ae" (UID: "848f3124-2a1f-45fa-bb83-893c3db866ae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.920887 4913 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.920930 4913 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/848f3124-2a1f-45fa-bb83-893c3db866ae-util\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:34 crc kubenswrapper[4913]: I1001 12:50:34.920942 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx86l\" (UniqueName: \"kubernetes.io/projected/848f3124-2a1f-45fa-bb83-893c3db866ae-kube-api-access-sx86l\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:35 crc kubenswrapper[4913]: I1001 12:50:35.296966 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" event={"ID":"848f3124-2a1f-45fa-bb83-893c3db866ae","Type":"ContainerDied","Data":"2b98f8e3576dd87882644f27ada34548ed330aed0d8a01ecfd00f40d0f8a1478"} Oct 01 12:50:35 crc kubenswrapper[4913]: I1001 12:50:35.297036 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b98f8e3576dd87882644f27ada34548ed330aed0d8a01ecfd00f40d0f8a1478" Oct 01 12:50:35 crc kubenswrapper[4913]: I1001 12:50:35.297259 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.113045 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f68lm"] Oct 01 12:50:36 crc kubenswrapper[4913]: E1001 12:50:36.113249 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="pull" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.113260 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="pull" Oct 01 12:50:36 crc kubenswrapper[4913]: E1001 12:50:36.113312 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="extract" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.113319 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="extract" Oct 01 12:50:36 crc kubenswrapper[4913]: E1001 12:50:36.113338 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="util" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.113343 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="util" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.113440 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="848f3124-2a1f-45fa-bb83-893c3db866ae" containerName="extract" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.114329 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.124920 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f68lm"] Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.137836 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpqtt\" (UniqueName: \"kubernetes.io/projected/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-kube-api-access-lpqtt\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.137984 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-utilities\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.138021 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-catalog-content\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.238999 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-utilities\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.239048 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-catalog-content\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.239088 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpqtt\" (UniqueName: \"kubernetes.io/projected/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-kube-api-access-lpqtt\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.239781 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-utilities\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.239859 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-catalog-content\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.259117 4913 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lpqtt\" (UniqueName: \"kubernetes.io/projected/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-kube-api-access-lpqtt\") pod \"redhat-marketplace-f68lm\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.448572 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:36 crc kubenswrapper[4913]: I1001 12:50:36.864705 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f68lm"] Oct 01 12:50:37 crc kubenswrapper[4913]: I1001 12:50:37.321235 4913 generic.go:334] "Generic (PLEG): container finished" podID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerID="861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938" exitCode=0 Oct 01 12:50:37 crc kubenswrapper[4913]: I1001 12:50:37.321296 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f68lm" event={"ID":"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02","Type":"ContainerDied","Data":"861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938"} Oct 01 12:50:37 crc kubenswrapper[4913]: I1001 12:50:37.321323 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f68lm" event={"ID":"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02","Type":"ContainerStarted","Data":"8bc11b182ae2e25f622d739db60df9aa8b0ce1164a9ce501bfb4b8fb90c71643"} Oct 01 12:50:39 crc kubenswrapper[4913]: I1001 12:50:39.340051 4913 generic.go:334] "Generic (PLEG): container finished" podID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerID="1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7" exitCode=0 Oct 01 12:50:39 crc kubenswrapper[4913]: I1001 12:50:39.340101 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f68lm" event={"ID":"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02","Type":"ContainerDied","Data":"1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7"} Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.083315 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.083359 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.083390 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.083793 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"770bb111d4d76e645ce0db85174ab71c7357ffc9ba302bee6b549ccfcb148bea"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:50:40 crc kubenswrapper[4913]: 
I1001 12:50:40.083841 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://770bb111d4d76e645ce0db85174ab71c7357ffc9ba302bee6b549ccfcb148bea" gracePeriod=600 Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.286422 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57"] Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.287684 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.289780 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qtzjz" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.325714 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57"] Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.365380 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f68lm" event={"ID":"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02","Type":"ContainerStarted","Data":"7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25"} Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.370023 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="770bb111d4d76e645ce0db85174ab71c7357ffc9ba302bee6b549ccfcb148bea" exitCode=0 Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.370061 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"770bb111d4d76e645ce0db85174ab71c7357ffc9ba302bee6b549ccfcb148bea"} Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.370103 4913 scope.go:117] "RemoveContainer" containerID="1cc68559427fcac3c481bc75066fa47eb6ec40478fc203e9f25d7c355d20a2fd" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.386108 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f68lm" podStartSLOduration=1.95805469 podStartE2EDuration="4.38608883s" podCreationTimestamp="2025-10-01 12:50:36 +0000 UTC" firstStartedPulling="2025-10-01 12:50:37.323992124 +0000 UTC m=+769.227467702" lastFinishedPulling="2025-10-01 12:50:39.752026264 +0000 UTC m=+771.655501842" observedRunningTime="2025-10-01 12:50:40.382234694 +0000 UTC m=+772.285710292" watchObservedRunningTime="2025-10-01 12:50:40.38608883 +0000 UTC m=+772.289564408" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.399208 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjj4h\" (UniqueName: \"kubernetes.io/projected/d54b747c-e08e-4b5c-b0ea-74651018cc09-kube-api-access-fjj4h\") pod \"openstack-operator-controller-operator-676c66f88b-7nb57\" (UID: \"d54b747c-e08e-4b5c-b0ea-74651018cc09\") " pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.500858 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fjj4h\" (UniqueName: \"kubernetes.io/projected/d54b747c-e08e-4b5c-b0ea-74651018cc09-kube-api-access-fjj4h\") pod \"openstack-operator-controller-operator-676c66f88b-7nb57\" (UID: \"d54b747c-e08e-4b5c-b0ea-74651018cc09\") " pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.529244 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjj4h\" (UniqueName: \"kubernetes.io/projected/d54b747c-e08e-4b5c-b0ea-74651018cc09-kube-api-access-fjj4h\") pod \"openstack-operator-controller-operator-676c66f88b-7nb57\" (UID: \"d54b747c-e08e-4b5c-b0ea-74651018cc09\") " pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:40 crc kubenswrapper[4913]: I1001 12:50:40.622137 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:41 crc kubenswrapper[4913]: I1001 12:50:41.082557 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57"] Oct 01 12:50:41 crc kubenswrapper[4913]: I1001 12:50:41.379091 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"799ca569c504b87f0203003c8051d299d1a44d32ea3031c0c1940d1be3fbaa96"} Oct 01 12:50:41 crc kubenswrapper[4913]: I1001 12:50:41.381246 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" event={"ID":"d54b747c-e08e-4b5c-b0ea-74651018cc09","Type":"ContainerStarted","Data":"d6259bf80a40757815bfe27206adbbee49d52fb33b5b5d5a0cc1e1372f4cfd4f"} Oct 01 12:50:45 crc kubenswrapper[4913]: I1001 12:50:45.410984 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" event={"ID":"d54b747c-e08e-4b5c-b0ea-74651018cc09","Type":"ContainerStarted","Data":"ba0649fae8ccbf4a91a923579d5ba385abe3cff70aa185780748226971ea2f51"} Oct 01 12:50:46 crc kubenswrapper[4913]: I1001 12:50:46.448841 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:46 crc kubenswrapper[4913]: I1001 12:50:46.449753 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:46 crc kubenswrapper[4913]: I1001 12:50:46.523837 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:47 crc kubenswrapper[4913]: I1001 12:50:47.480069 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:48 crc kubenswrapper[4913]: I1001 12:50:48.429514 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" event={"ID":"d54b747c-e08e-4b5c-b0ea-74651018cc09","Type":"ContainerStarted","Data":"f245fb80f640141d9ea93f59166cf3908fd0e33bd29a5d1508d345c1fb6647e9"} Oct 01 12:50:48 crc kubenswrapper[4913]: I1001 12:50:48.429935 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:48 crc kubenswrapper[4913]: I1001 12:50:48.480067 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" podStartSLOduration=2.185263424 podStartE2EDuration="8.48005201s" podCreationTimestamp="2025-10-01 12:50:40 +0000 UTC" firstStartedPulling="2025-10-01 12:50:41.086899565 +0000 UTC m=+772.990375173" lastFinishedPulling="2025-10-01 12:50:47.381688181 +0000 UTC m=+779.285163759" observedRunningTime="2025-10-01 12:50:48.477499039 +0000 UTC m=+780.380974707" watchObservedRunningTime="2025-10-01 12:50:48.48005201 +0000 UTC m=+780.383527588" Oct 01 12:50:48 crc kubenswrapper[4913]: I1001 12:50:48.485320 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f68lm"] Oct 01 12:50:49 crc kubenswrapper[4913]: I1001 12:50:49.436118 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f68lm" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="registry-server" containerID="cri-o://7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25" gracePeriod=2 Oct 01 12:50:49 crc kubenswrapper[4913]: I1001 12:50:49.970871 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.126706 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpqtt\" (UniqueName: \"kubernetes.io/projected/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-kube-api-access-lpqtt\") pod \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.126867 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-utilities\") pod \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.126902 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-catalog-content\") pod \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\" (UID: \"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02\") " Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.128054 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-utilities" (OuterVolumeSpecName: "utilities") pod "6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" (UID: "6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.133303 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-kube-api-access-lpqtt" (OuterVolumeSpecName: "kube-api-access-lpqtt") pod "6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" (UID: "6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02"). InnerVolumeSpecName "kube-api-access-lpqtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.152495 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" (UID: "6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.229634 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpqtt\" (UniqueName: \"kubernetes.io/projected/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-kube-api-access-lpqtt\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.229681 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.229702 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.446327 4913 generic.go:334] "Generic (PLEG): container finished" podID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerID="7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25" exitCode=0 Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.446402 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f68lm" event={"ID":"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02","Type":"ContainerDied","Data":"7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25"} Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.446451 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f68lm" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.446654 4913 scope.go:117] "RemoveContainer" containerID="7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.446637 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f68lm" event={"ID":"6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02","Type":"ContainerDied","Data":"8bc11b182ae2e25f622d739db60df9aa8b0ce1164a9ce501bfb4b8fb90c71643"} Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.472024 4913 scope.go:117] "RemoveContainer" containerID="1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.493744 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f68lm"] Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.501431 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f68lm"] Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.502008 4913 scope.go:117] "RemoveContainer" containerID="861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.522876 4913 scope.go:117] "RemoveContainer" containerID="7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25" Oct 01 12:50:50 crc kubenswrapper[4913]: E1001 12:50:50.523399 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25\": container with ID starting with 7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25 not found: ID does not exist" containerID="7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.523429 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25"} err="failed to get container status \"7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25\": rpc error: code = NotFound desc = could not find container \"7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25\": container with ID starting with 7c36659d565bdb29cd3a015f105d4a650afc549ef30ff47d8fd7c212fd712f25 not found: ID does not exist" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.523449 4913 scope.go:117] "RemoveContainer" containerID="1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7" Oct 01 12:50:50 crc kubenswrapper[4913]: E1001 12:50:50.523650 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7\": container with ID starting with 1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7 not found: ID does not exist" containerID="1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.523667 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7"} err="failed to get container status \"1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7\": rpc error: code = NotFound desc = could not find 
container \"1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7\": container with ID starting with 1461ea435b9efa4753d92d1ac7629f1a64ae8eef56da433b503e06c5a82c63f7 not found: ID does not exist" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.523679 4913 scope.go:117] "RemoveContainer" containerID="861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938" Oct 01 12:50:50 crc kubenswrapper[4913]: E1001 12:50:50.524467 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938\": container with ID starting with 861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938 not found: ID does not exist" containerID="861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.524547 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938"} err="failed to get container status \"861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938\": rpc error: code = NotFound desc = could not find container \"861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938\": container with ID starting with 861e11b6d5ecef6d7a5d9979dcf4e78c7ecb48d0db22fa0a138e986dfd40c938 not found: ID does not exist" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.626096 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 12:50:50 crc kubenswrapper[4913]: I1001 12:50:50.824014 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" path="/var/lib/kubelet/pods/6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02/volumes" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.890765 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rv85"] Oct 01 12:50:52 crc kubenswrapper[4913]: E1001 12:50:52.891421 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="extract-content" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.891438 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="extract-content" Oct 01 12:50:52 crc kubenswrapper[4913]: E1001 12:50:52.891473 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="extract-utilities" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.891481 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="extract-utilities" Oct 01 12:50:52 crc kubenswrapper[4913]: E1001 12:50:52.891491 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="registry-server" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.891501 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="registry-server" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.891626 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3c47ac-c0dd-4c98-8b47-3a449b3b9d02" containerName="registry-server" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.892614 4913 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:52 crc kubenswrapper[4913]: I1001 12:50:52.917862 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rv85"] Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.066747 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-utilities\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.066791 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-catalog-content\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.066854 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnv4p\" (UniqueName: \"kubernetes.io/projected/0bf71fee-abbb-4128-a5d3-1d57d0933374-kube-api-access-nnv4p\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.167666 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnv4p\" (UniqueName: \"kubernetes.io/projected/0bf71fee-abbb-4128-a5d3-1d57d0933374-kube-api-access-nnv4p\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.167752 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-utilities\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.167771 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-catalog-content\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.168298 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-catalog-content\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.168391 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-utilities\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 
12:50:53.189210 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnv4p\" (UniqueName: \"kubernetes.io/projected/0bf71fee-abbb-4128-a5d3-1d57d0933374-kube-api-access-nnv4p\") pod \"certified-operators-8rv85\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.211285 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:50:53 crc kubenswrapper[4913]: I1001 12:50:53.713354 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rv85"] Oct 01 12:50:53 crc kubenswrapper[4913]: W1001 12:50:53.722043 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf71fee_abbb_4128_a5d3_1d57d0933374.slice/crio-c49875d52a6c3654b67ee919967ef98b38d8ca2bf63bc6e26ec3a7e57cc6a7d8 WatchSource:0}: Error finding container c49875d52a6c3654b67ee919967ef98b38d8ca2bf63bc6e26ec3a7e57cc6a7d8: Status 404 returned error can't find the container with id c49875d52a6c3654b67ee919967ef98b38d8ca2bf63bc6e26ec3a7e57cc6a7d8 Oct 01 12:50:54 crc kubenswrapper[4913]: I1001 12:50:54.472693 4913 generic.go:334] "Generic (PLEG): container finished" podID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerID="35b6808ded3a75f0a14804f514dfc68a2668ab2513f8c14da6d19f50b25d5b3b" exitCode=0 Oct 01 12:50:54 crc kubenswrapper[4913]: I1001 12:50:54.472735 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerDied","Data":"35b6808ded3a75f0a14804f514dfc68a2668ab2513f8c14da6d19f50b25d5b3b"} Oct 01 12:50:54 crc kubenswrapper[4913]: I1001 12:50:54.472760 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerStarted","Data":"c49875d52a6c3654b67ee919967ef98b38d8ca2bf63bc6e26ec3a7e57cc6a7d8"} Oct 01 12:50:55 crc kubenswrapper[4913]: I1001 12:50:55.484246 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerStarted","Data":"be3e025bcc328e9f1265019ef5df83d23c096cd41a61ae5214c2b142b25f2fc8"} Oct 01 12:50:56 crc kubenswrapper[4913]: I1001 12:50:56.496054 4913 generic.go:334] "Generic (PLEG): container finished" podID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerID="be3e025bcc328e9f1265019ef5df83d23c096cd41a61ae5214c2b142b25f2fc8" exitCode=0 Oct 01 12:50:56 crc kubenswrapper[4913]: I1001 12:50:56.496155 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerDied","Data":"be3e025bcc328e9f1265019ef5df83d23c096cd41a61ae5214c2b142b25f2fc8"} Oct 01 12:50:57 crc kubenswrapper[4913]: I1001 12:50:57.504795 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerStarted","Data":"b7a0f45b78b53275c37aa5752e2926a43f34ea4c0dedfe1ebb2724f134e96506"} Oct 01 12:50:57 crc kubenswrapper[4913]: I1001 12:50:57.533113 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-8rv85" podStartSLOduration=3.057697666 podStartE2EDuration="5.533100067s" podCreationTimestamp="2025-10-01 12:50:52 +0000 UTC" firstStartedPulling="2025-10-01 12:50:54.474548176 +0000 UTC m=+786.378023754" lastFinishedPulling="2025-10-01 12:50:56.949950537 +0000 UTC m=+788.853426155" observedRunningTime="2025-10-01 12:50:57.530629898 +0000 UTC m=+789.434105516" watchObservedRunningTime="2025-10-01 12:50:57.533100067 +0000 UTC m=+789.436575645" Oct 01 12:51:03 crc kubenswrapper[4913]: I1001 12:51:03.212060 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:51:03 crc kubenswrapper[4913]: I1001 12:51:03.212626 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:51:03 crc kubenswrapper[4913]: I1001 12:51:03.284824 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:51:03 crc kubenswrapper[4913]: I1001 12:51:03.607207 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:51:03 crc kubenswrapper[4913]: I1001 12:51:03.652595 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rv85"] Oct 01 12:51:05 crc kubenswrapper[4913]: I1001 12:51:05.562478 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rv85" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="registry-server" containerID="cri-o://b7a0f45b78b53275c37aa5752e2926a43f34ea4c0dedfe1ebb2724f134e96506" gracePeriod=2 Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.570774 4913 generic.go:334] "Generic (PLEG): container finished" podID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerID="b7a0f45b78b53275c37aa5752e2926a43f34ea4c0dedfe1ebb2724f134e96506" exitCode=0 Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.570840 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerDied","Data":"b7a0f45b78b53275c37aa5752e2926a43f34ea4c0dedfe1ebb2724f134e96506"} Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.571099 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rv85" event={"ID":"0bf71fee-abbb-4128-a5d3-1d57d0933374","Type":"ContainerDied","Data":"c49875d52a6c3654b67ee919967ef98b38d8ca2bf63bc6e26ec3a7e57cc6a7d8"} Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.571114 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c49875d52a6c3654b67ee919967ef98b38d8ca2bf63bc6e26ec3a7e57cc6a7d8" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.593500 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.760740 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-catalog-content\") pod \"0bf71fee-abbb-4128-a5d3-1d57d0933374\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.761051 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnv4p\" (UniqueName: \"kubernetes.io/projected/0bf71fee-abbb-4128-a5d3-1d57d0933374-kube-api-access-nnv4p\") pod \"0bf71fee-abbb-4128-a5d3-1d57d0933374\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.761132 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-utilities\") pod \"0bf71fee-abbb-4128-a5d3-1d57d0933374\" (UID: \"0bf71fee-abbb-4128-a5d3-1d57d0933374\") " Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.762460 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-utilities" (OuterVolumeSpecName: "utilities") pod "0bf71fee-abbb-4128-a5d3-1d57d0933374" (UID: "0bf71fee-abbb-4128-a5d3-1d57d0933374"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.769328 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf71fee-abbb-4128-a5d3-1d57d0933374-kube-api-access-nnv4p" (OuterVolumeSpecName: "kube-api-access-nnv4p") pod "0bf71fee-abbb-4128-a5d3-1d57d0933374" (UID: "0bf71fee-abbb-4128-a5d3-1d57d0933374"). InnerVolumeSpecName "kube-api-access-nnv4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.824599 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bf71fee-abbb-4128-a5d3-1d57d0933374" (UID: "0bf71fee-abbb-4128-a5d3-1d57d0933374"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.862987 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnv4p\" (UniqueName: \"kubernetes.io/projected/0bf71fee-abbb-4128-a5d3-1d57d0933374-kube-api-access-nnv4p\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.863032 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:06 crc kubenswrapper[4913]: I1001 12:51:06.863049 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71fee-abbb-4128-a5d3-1d57d0933374-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:07 crc kubenswrapper[4913]: I1001 12:51:07.578828 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rv85" Oct 01 12:51:07 crc kubenswrapper[4913]: I1001 12:51:07.634517 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rv85"] Oct 01 12:51:07 crc kubenswrapper[4913]: I1001 12:51:07.644055 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rv85"] Oct 01 12:51:08 crc kubenswrapper[4913]: I1001 12:51:08.817851 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" path="/var/lib/kubelet/pods/0bf71fee-abbb-4128-a5d3-1d57d0933374/volumes" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.942780 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v7sll"] Oct 01 12:51:09 crc kubenswrapper[4913]: E1001 12:51:09.943432 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="extract-utilities" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.943452 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="extract-utilities" Oct 01 12:51:09 crc kubenswrapper[4913]: E1001 12:51:09.943467 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="registry-server" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.943477 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="registry-server" Oct 01 12:51:09 crc kubenswrapper[4913]: E1001 12:51:09.943501 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="extract-content" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.943512 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="extract-content" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.943717 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf71fee-abbb-4128-a5d3-1d57d0933374" containerName="registry-server" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.945344 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:09 crc kubenswrapper[4913]: I1001 12:51:09.971102 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v7sll"] Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.008425 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzjx\" (UniqueName: \"kubernetes.io/projected/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-kube-api-access-8hzjx\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.008509 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-utilities\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.008581 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-catalog-content\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.110577 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzjx\" (UniqueName: \"kubernetes.io/projected/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-kube-api-access-8hzjx\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.110636 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-utilities\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.110679 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-catalog-content\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.111288 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-catalog-content\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.111299 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-utilities\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.133913 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8hzjx\" (UniqueName: \"kubernetes.io/projected/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-kube-api-access-8hzjx\") pod \"redhat-operators-v7sll\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.274007 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:10 crc kubenswrapper[4913]: I1001 12:51:10.713550 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v7sll"] Oct 01 12:51:11 crc kubenswrapper[4913]: I1001 12:51:11.608570 4913 generic.go:334] "Generic (PLEG): container finished" podID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerID="6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d" exitCode=0 Oct 01 12:51:11 crc kubenswrapper[4913]: I1001 12:51:11.608631 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7sll" event={"ID":"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631","Type":"ContainerDied","Data":"6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d"} Oct 01 12:51:11 crc kubenswrapper[4913]: I1001 12:51:11.608861 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7sll" event={"ID":"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631","Type":"ContainerStarted","Data":"d7e1f13763df11c97a8b331dc6b02eba9adc0ebeaff6ada265ade35744408fb2"} Oct 01 12:51:13 crc kubenswrapper[4913]: I1001 12:51:13.627158 4913 generic.go:334] "Generic (PLEG): container finished" podID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerID="0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc" exitCode=0 Oct 01 12:51:13 crc kubenswrapper[4913]: I1001 12:51:13.627218 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7sll" event={"ID":"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631","Type":"ContainerDied","Data":"0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc"} Oct 01 12:51:14 crc kubenswrapper[4913]: I1001 12:51:14.637556 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7sll" event={"ID":"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631","Type":"ContainerStarted","Data":"59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499"} Oct 01 12:51:20 crc kubenswrapper[4913]: I1001 12:51:20.275245 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:20 crc kubenswrapper[4913]: I1001 12:51:20.275672 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:20 crc kubenswrapper[4913]: I1001 12:51:20.332307 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:20 crc kubenswrapper[4913]: I1001 12:51:20.363230 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v7sll" podStartSLOduration=8.939343964999999 podStartE2EDuration="11.363203844s" podCreationTimestamp="2025-10-01 12:51:09 +0000 UTC" firstStartedPulling="2025-10-01 12:51:11.610362043 +0000 UTC m=+803.513837621" lastFinishedPulling="2025-10-01 12:51:14.034221882 +0000 UTC m=+805.937697500" observedRunningTime="2025-10-01 12:51:14.663728431 +0000 UTC m=+806.567204039" 
watchObservedRunningTime="2025-10-01 12:51:20.363203844 +0000 UTC m=+812.266679452" Oct 01 12:51:20 crc kubenswrapper[4913]: I1001 12:51:20.708685 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:20 crc kubenswrapper[4913]: I1001 12:51:20.758430 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v7sll"] Oct 01 12:51:22 crc kubenswrapper[4913]: I1001 12:51:22.680778 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v7sll" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="registry-server" containerID="cri-o://59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499" gracePeriod=2 Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.066327 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.098564 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hzjx\" (UniqueName: \"kubernetes.io/projected/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-kube-api-access-8hzjx\") pod \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.098625 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-catalog-content\") pod \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.098659 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-utilities\") pod \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\" (UID: \"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631\") " Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.099576 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-utilities" (OuterVolumeSpecName: "utilities") pod "dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" (UID: "dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.112537 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-kube-api-access-8hzjx" (OuterVolumeSpecName: "kube-api-access-8hzjx") pod "dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" (UID: "dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631"). InnerVolumeSpecName "kube-api-access-8hzjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.200912 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hzjx\" (UniqueName: \"kubernetes.io/projected/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-kube-api-access-8hzjx\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.200952 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.239475 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" (UID: "dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.302014 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.688302 4913 generic.go:334] "Generic (PLEG): container finished" podID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerID="59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499" exitCode=0 Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.688341 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7sll" event={"ID":"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631","Type":"ContainerDied","Data":"59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499"} Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.688351 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v7sll" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.688370 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7sll" event={"ID":"dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631","Type":"ContainerDied","Data":"d7e1f13763df11c97a8b331dc6b02eba9adc0ebeaff6ada265ade35744408fb2"} Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.688386 4913 scope.go:117] "RemoveContainer" containerID="59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.701156 4913 scope.go:117] "RemoveContainer" containerID="0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.714837 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v7sll"] Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.718128 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v7sll"] Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.742359 4913 scope.go:117] "RemoveContainer" containerID="6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.759163 4913 scope.go:117] "RemoveContainer" containerID="59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499" Oct 01 12:51:23 crc kubenswrapper[4913]: E1001 12:51:23.760764 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499\": container with ID starting with 59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499 not found: ID does not exist" containerID="59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.760818 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499"} err="failed to get container status \"59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499\": rpc error: code = NotFound desc = could not find container \"59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499\": container with ID starting with 59843346409af6726b6752d8b378be9814e0ee6bb4ec7d8ffec6e65e37a82499 not found: ID does not exist" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.760905 4913 scope.go:117] "RemoveContainer" containerID="0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc" Oct 01 12:51:23 crc kubenswrapper[4913]: E1001 12:51:23.761289 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc\": container with ID starting with 0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc not found: ID does not exist" containerID="0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.761334 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc"} err="failed to get container status \"0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc\": rpc error: code = NotFound desc = could not find container 
\"0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc\": container with ID starting with 0f426d5bda3b52df02a57e7924d7d339486ba2f83225b44f8750dfbb51db68bc not found: ID does not exist" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.761364 4913 scope.go:117] "RemoveContainer" containerID="6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d" Oct 01 12:51:23 crc kubenswrapper[4913]: E1001 12:51:23.761659 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d\": container with ID starting with 6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d not found: ID does not exist" containerID="6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d" Oct 01 12:51:23 crc kubenswrapper[4913]: I1001 12:51:23.761698 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d"} err="failed to get container status \"6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d\": rpc error: code = NotFound desc = could not find container \"6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d\": container with ID starting with 6cce1bbc3c0bd65623204a56b6a2f9f57076a41c58229cfbc7804dff5fca442d not found: ID does not exist" Oct 01 12:51:24 crc kubenswrapper[4913]: I1001 12:51:24.817250 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" path="/var/lib/kubelet/pods/dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631/volumes" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.482057 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl"] Oct 01 12:51:26 crc kubenswrapper[4913]: E1001 12:51:26.482479 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="extract-utilities" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.482501 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="extract-utilities" Oct 01 12:51:26 crc kubenswrapper[4913]: E1001 12:51:26.482528 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="extract-content" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.482544 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="extract-content" Oct 01 12:51:26 crc kubenswrapper[4913]: E1001 12:51:26.482562 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="registry-server" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.482574 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="registry-server" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.482838 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf417a3-b1a8-4ce0-a0f7-1f43fd19c631" containerName="registry-server" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.484012 4913 util.go:30] "No sandbox for pod can be found. 
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.484012 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.486325 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jqrkq"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.487204 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.488562 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.490529 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-468bc"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.503376 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.504249 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.510015 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-c8shh"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.511003 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.514893 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.521470 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.541873 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.542998 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.546710 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.548006 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.548770 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxp78\" (UniqueName: \"kubernetes.io/projected/9d418cfb-ec47-4ba7-b29b-5e68fddf11e4-kube-api-access-sxp78\") pod \"cinder-operator-controller-manager-859cd486d-p6xjl\" (UID: \"9d418cfb-ec47-4ba7-b29b-5e68fddf11e4\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.548842 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58kf6\" (UniqueName: \"kubernetes.io/projected/cb312f93-72ce-4d3e-9d89-66526e40dca2-kube-api-access-58kf6\") pod \"designate-operator-controller-manager-77fb7bcf5b-pbjqc\" (UID: \"cb312f93-72ce-4d3e-9d89-66526e40dca2\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.548872 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98zfd\" (UniqueName: \"kubernetes.io/projected/086685ca-996d-44d9-bd02-33cc99e5dab9-kube-api-access-98zfd\") pod \"barbican-operator-controller-manager-f7f98cb69-wdw6b\" (UID: \"086685ca-996d-44d9-bd02-33cc99e5dab9\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.551174 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mp85h" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.551245 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-lgx2b" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.563305 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.567845 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.568988 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.597121 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4m8n6" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.628394 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.653288 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mlk\" (UniqueName: \"kubernetes.io/projected/d7b4ec90-a547-48a1-83ac-3528c53f90f0-kube-api-access-g7mlk\") pod \"heat-operator-controller-manager-5b4fc86755-gq8sp\" (UID: \"d7b4ec90-a547-48a1-83ac-3528c53f90f0\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.653342 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjm5m\" (UniqueName: \"kubernetes.io/projected/55b36494-1091-44b0-b303-ac62e5cef841-kube-api-access-tjm5m\") pod \"glance-operator-controller-manager-8bc4775b5-rxgj5\" (UID: \"55b36494-1091-44b0-b303-ac62e5cef841\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.653381 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7smw2\" (UniqueName: \"kubernetes.io/projected/845f41f1-b619-4984-aa2a-bae46992d463-kube-api-access-7smw2\") pod \"horizon-operator-controller-manager-679b4759bb-mh5sx\" (UID: \"845f41f1-b619-4984-aa2a-bae46992d463\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.653420 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxp78\" (UniqueName: \"kubernetes.io/projected/9d418cfb-ec47-4ba7-b29b-5e68fddf11e4-kube-api-access-sxp78\") pod \"cinder-operator-controller-manager-859cd486d-p6xjl\" (UID: \"9d418cfb-ec47-4ba7-b29b-5e68fddf11e4\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.653460 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58kf6\" (UniqueName: \"kubernetes.io/projected/cb312f93-72ce-4d3e-9d89-66526e40dca2-kube-api-access-58kf6\") pod \"designate-operator-controller-manager-77fb7bcf5b-pbjqc\" (UID: \"cb312f93-72ce-4d3e-9d89-66526e40dca2\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.653484 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98zfd\" (UniqueName: \"kubernetes.io/projected/086685ca-996d-44d9-bd02-33cc99e5dab9-kube-api-access-98zfd\") pod \"barbican-operator-controller-manager-f7f98cb69-wdw6b\" (UID: \"086685ca-996d-44d9-bd02-33cc99e5dab9\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.677756 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98zfd\" (UniqueName: 
\"kubernetes.io/projected/086685ca-996d-44d9-bd02-33cc99e5dab9-kube-api-access-98zfd\") pod \"barbican-operator-controller-manager-f7f98cb69-wdw6b\" (UID: \"086685ca-996d-44d9-bd02-33cc99e5dab9\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.683258 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58kf6\" (UniqueName: \"kubernetes.io/projected/cb312f93-72ce-4d3e-9d89-66526e40dca2-kube-api-access-58kf6\") pod \"designate-operator-controller-manager-77fb7bcf5b-pbjqc\" (UID: \"cb312f93-72ce-4d3e-9d89-66526e40dca2\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.693016 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxp78\" (UniqueName: \"kubernetes.io/projected/9d418cfb-ec47-4ba7-b29b-5e68fddf11e4-kube-api-access-sxp78\") pod \"cinder-operator-controller-manager-859cd486d-p6xjl\" (UID: \"9d418cfb-ec47-4ba7-b29b-5e68fddf11e4\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.694620 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.695838 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.698218 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.700425 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.700541 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zp45v" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.706326 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.707371 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.710824 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lvzvm" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.717604 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.724034 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.725133 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.728381 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-slgqv" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.745633 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754150 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7smw2\" (UniqueName: \"kubernetes.io/projected/845f41f1-b619-4984-aa2a-bae46992d463-kube-api-access-7smw2\") pod \"horizon-operator-controller-manager-679b4759bb-mh5sx\" (UID: \"845f41f1-b619-4984-aa2a-bae46992d463\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754191 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl7xk\" (UniqueName: \"kubernetes.io/projected/5c79503f-3029-498b-b9b1-e2df43820cb2-kube-api-access-jl7xk\") pod \"ironic-operator-controller-manager-5f45cd594f-hdrq8\" (UID: \"5c79503f-3029-498b-b9b1-e2df43820cb2\") " pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754216 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lz8h\" (UniqueName: \"kubernetes.io/projected/5ac2ed71-cb90-4003-b8f9-5ad6748c08d5-kube-api-access-6lz8h\") pod \"infra-operator-controller-manager-5c8fdc4d5c-2ztjf\" (UID: \"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754249 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4zl\" (UniqueName: \"kubernetes.io/projected/e5ec32c8-323f-4c74-bf82-4dc2a70db41a-kube-api-access-kv4zl\") pod \"keystone-operator-controller-manager-59d7dc95cf-s7zlm\" (UID: \"e5ec32c8-323f-4c74-bf82-4dc2a70db41a\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754296 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ac2ed71-cb90-4003-b8f9-5ad6748c08d5-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-2ztjf\" (UID: \"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754321 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mlk\" (UniqueName: \"kubernetes.io/projected/d7b4ec90-a547-48a1-83ac-3528c53f90f0-kube-api-access-g7mlk\") pod \"heat-operator-controller-manager-5b4fc86755-gq8sp\" (UID: \"d7b4ec90-a547-48a1-83ac-3528c53f90f0\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.754347 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjm5m\" (UniqueName: 
\"kubernetes.io/projected/55b36494-1091-44b0-b303-ac62e5cef841-kube-api-access-tjm5m\") pod \"glance-operator-controller-manager-8bc4775b5-rxgj5\" (UID: \"55b36494-1091-44b0-b303-ac62e5cef841\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.760328 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.761396 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.763926 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-wfcnz" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.794797 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.796983 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjm5m\" (UniqueName: \"kubernetes.io/projected/55b36494-1091-44b0-b303-ac62e5cef841-kube-api-access-tjm5m\") pod \"glance-operator-controller-manager-8bc4775b5-rxgj5\" (UID: \"55b36494-1091-44b0-b303-ac62e5cef841\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.802885 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7smw2\" (UniqueName: \"kubernetes.io/projected/845f41f1-b619-4984-aa2a-bae46992d463-kube-api-access-7smw2\") pod \"horizon-operator-controller-manager-679b4759bb-mh5sx\" (UID: \"845f41f1-b619-4984-aa2a-bae46992d463\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.810617 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.816378 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7mlk\" (UniqueName: \"kubernetes.io/projected/d7b4ec90-a547-48a1-83ac-3528c53f90f0-kube-api-access-g7mlk\") pod \"heat-operator-controller-manager-5b4fc86755-gq8sp\" (UID: \"d7b4ec90-a547-48a1-83ac-3528c53f90f0\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.828264 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.843234 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.844310 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.847062 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-q6v7g" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.852495 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.853514 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.855632 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl7xk\" (UniqueName: \"kubernetes.io/projected/5c79503f-3029-498b-b9b1-e2df43820cb2-kube-api-access-jl7xk\") pod \"ironic-operator-controller-manager-5f45cd594f-hdrq8\" (UID: \"5c79503f-3029-498b-b9b1-e2df43820cb2\") " pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.855671 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lz8h\" (UniqueName: \"kubernetes.io/projected/5ac2ed71-cb90-4003-b8f9-5ad6748c08d5-kube-api-access-6lz8h\") pod \"infra-operator-controller-manager-5c8fdc4d5c-2ztjf\" (UID: \"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.855707 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4zl\" (UniqueName: \"kubernetes.io/projected/e5ec32c8-323f-4c74-bf82-4dc2a70db41a-kube-api-access-kv4zl\") pod \"keystone-operator-controller-manager-59d7dc95cf-s7zlm\" (UID: \"e5ec32c8-323f-4c74-bf82-4dc2a70db41a\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.855734 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k2jk\" (UniqueName: \"kubernetes.io/projected/13e74825-5720-4c49-97e9-a0fccf649b50-kube-api-access-5k2jk\") pod \"manila-operator-controller-manager-b7cf8cb5f-5jvz7\" (UID: \"13e74825-5720-4c49-97e9-a0fccf649b50\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.855763 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ac2ed71-cb90-4003-b8f9-5ad6748c08d5-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-2ztjf\" (UID: \"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.855793 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9r42\" (UniqueName: \"kubernetes.io/projected/e80ad896-dd86-43e6-850b-f12088a61cf5-kube-api-access-l9r42\") pod \"neutron-operator-controller-manager-54fbbfcd44-f8tw7\" (UID: \"e80ad896-dd86-43e6-850b-f12088a61cf5\") " pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.858984 4913 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.860307 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.867499 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rp5bf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.874935 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.878076 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.878757 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ac2ed71-cb90-4003-b8f9-5ad6748c08d5-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-2ztjf\" (UID: \"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.894456 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lz8h\" (UniqueName: \"kubernetes.io/projected/5ac2ed71-cb90-4003-b8f9-5ad6748c08d5-kube-api-access-6lz8h\") pod \"infra-operator-controller-manager-5c8fdc4d5c-2ztjf\" (UID: \"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.900984 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4zl\" (UniqueName: \"kubernetes.io/projected/e5ec32c8-323f-4c74-bf82-4dc2a70db41a-kube-api-access-kv4zl\") pod \"keystone-operator-controller-manager-59d7dc95cf-s7zlm\" (UID: \"e5ec32c8-323f-4c74-bf82-4dc2a70db41a\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.903555 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.904941 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl7xk\" (UniqueName: \"kubernetes.io/projected/5c79503f-3029-498b-b9b1-e2df43820cb2-kube-api-access-jl7xk\") pod \"ironic-operator-controller-manager-5f45cd594f-hdrq8\" (UID: \"5c79503f-3029-498b-b9b1-e2df43820cb2\") " pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.906948 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.908722 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.910510 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rm2fq" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.915326 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.916406 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.921110 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.921560 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5ffb6" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.930724 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.936075 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.937126 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.940802 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9nxns" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.952547 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.959799 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.961171 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpc4q\" (UniqueName: \"kubernetes.io/projected/8f9b91c6-b4e9-44b3-83c2-42412d48de96-kube-api-access-xpc4q\") pod \"mariadb-operator-controller-manager-67bf5bb885-nsm78\" (UID: \"8f9b91c6-b4e9-44b3-83c2-42412d48de96\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.961200 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k2jk\" (UniqueName: \"kubernetes.io/projected/13e74825-5720-4c49-97e9-a0fccf649b50-kube-api-access-5k2jk\") pod \"manila-operator-controller-manager-b7cf8cb5f-5jvz7\" (UID: \"13e74825-5720-4c49-97e9-a0fccf649b50\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.961224 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pww9r\" (UniqueName: 
\"kubernetes.io/projected/b7681632-31dc-4278-889a-8b89ddccac74-kube-api-access-pww9r\") pod \"octavia-operator-controller-manager-75f8d67d86-pmhs8\" (UID: \"b7681632-31dc-4278-889a-8b89ddccac74\") " pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.961311 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9r42\" (UniqueName: \"kubernetes.io/projected/e80ad896-dd86-43e6-850b-f12088a61cf5-kube-api-access-l9r42\") pod \"neutron-operator-controller-manager-54fbbfcd44-f8tw7\" (UID: \"e80ad896-dd86-43e6-850b-f12088a61cf5\") " pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.961336 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtrmc\" (UniqueName: \"kubernetes.io/projected/f28004e7-4e00-4ffc-ae3a-cfad4022387a-kube-api-access-wtrmc\") pod \"ovn-operator-controller-manager-84c745747f-f6wpw\" (UID: \"f28004e7-4e00-4ffc-ae3a-cfad4022387a\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.961361 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxzl\" (UniqueName: \"kubernetes.io/projected/a34a6147-f982-48cf-9976-b39c4dd420cf-kube-api-access-rdxzl\") pod \"nova-operator-controller-manager-7fd5b6bbc6-45w5m\" (UID: \"a34a6147-f982-48cf-9976-b39c4dd420cf\") " pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.966990 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.967993 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.972916 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bphsz" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.973348 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.977734 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"] Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.979202 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.980070 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k2jk\" (UniqueName: \"kubernetes.io/projected/13e74825-5720-4c49-97e9-a0fccf649b50-kube-api-access-5k2jk\") pod \"manila-operator-controller-manager-b7cf8cb5f-5jvz7\" (UID: \"13e74825-5720-4c49-97e9-a0fccf649b50\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.981438 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jnndf"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.981912 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.985120 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"]
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.986216 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.987887 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-7dc8h"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.988297 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9r42\" (UniqueName: \"kubernetes.io/projected/e80ad896-dd86-43e6-850b-f12088a61cf5-kube-api-access-l9r42\") pod \"neutron-operator-controller-manager-54fbbfcd44-f8tw7\" (UID: \"e80ad896-dd86-43e6-850b-f12088a61cf5\") " pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7"
Oct 01 12:51:26 crc kubenswrapper[4913]: I1001 12:51:26.993279 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.002215 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.006207 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.020053 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.021602 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.028385 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-d265k"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.044402 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.055964 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.068715 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.077941 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078421 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtrmc\" (UniqueName: \"kubernetes.io/projected/f28004e7-4e00-4ffc-ae3a-cfad4022387a-kube-api-access-wtrmc\") pod \"ovn-operator-controller-manager-84c745747f-f6wpw\" (UID: \"f28004e7-4e00-4ffc-ae3a-cfad4022387a\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078493 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f17e8a8-3251-4a03-8f8d-70698d5146b3-cert\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078535 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r824c\" (UniqueName: \"kubernetes.io/projected/3f17e8a8-3251-4a03-8f8d-70698d5146b3-kube-api-access-r824c\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078559 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxzl\" (UniqueName: \"kubernetes.io/projected/a34a6147-f982-48cf-9976-b39c4dd420cf-kube-api-access-rdxzl\") pod \"nova-operator-controller-manager-7fd5b6bbc6-45w5m\" (UID: \"a34a6147-f982-48cf-9976-b39c4dd420cf\") " pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078852 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpcss\" (UniqueName: \"kubernetes.io/projected/61fd19a3-0b06-401a-9bf3-9d2a34bbf291-kube-api-access-wpcss\") pod \"placement-operator-controller-manager-598c4c8547-pvvml\" (UID: \"61fd19a3-0b06-401a-9bf3-9d2a34bbf291\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078887 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr6v2\" (UniqueName: \"kubernetes.io/projected/1549da87-3151-467a-92d6-de0709a3a6a7-kube-api-access-zr6v2\") pod \"telemetry-operator-controller-manager-cb66d6b59-2djx2\" (UID: \"1549da87-3151-467a-92d6-de0709a3a6a7\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078918 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swjd8\" (UniqueName: \"kubernetes.io/projected/a19e1891-5907-42dc-9894-6c9b3bcb5cce-kube-api-access-swjd8\") pod \"swift-operator-controller-manager-689b4f76c9-8668d\" (UID: \"a19e1891-5907-42dc-9894-6c9b3bcb5cce\") " pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.078998 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpc4q\" (UniqueName: \"kubernetes.io/projected/8f9b91c6-b4e9-44b3-83c2-42412d48de96-kube-api-access-xpc4q\") pod \"mariadb-operator-controller-manager-67bf5bb885-nsm78\" (UID: \"8f9b91c6-b4e9-44b3-83c2-42412d48de96\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.079031 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pww9r\" (UniqueName: \"kubernetes.io/projected/b7681632-31dc-4278-889a-8b89ddccac74-kube-api-access-pww9r\") pod \"octavia-operator-controller-manager-75f8d67d86-pmhs8\" (UID: \"b7681632-31dc-4278-889a-8b89ddccac74\") " pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.105085 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpc4q\" (UniqueName: \"kubernetes.io/projected/8f9b91c6-b4e9-44b3-83c2-42412d48de96-kube-api-access-xpc4q\") pod \"mariadb-operator-controller-manager-67bf5bb885-nsm78\" (UID: \"8f9b91c6-b4e9-44b3-83c2-42412d48de96\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.105858 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxzl\" (UniqueName: \"kubernetes.io/projected/a34a6147-f982-48cf-9976-b39c4dd420cf-kube-api-access-rdxzl\") pod \"nova-operator-controller-manager-7fd5b6bbc6-45w5m\" (UID: \"a34a6147-f982-48cf-9976-b39c4dd420cf\") " pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.105898 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pww9r\" (UniqueName: \"kubernetes.io/projected/b7681632-31dc-4278-889a-8b89ddccac74-kube-api-access-pww9r\") pod \"octavia-operator-controller-manager-75f8d67d86-pmhs8\" (UID: \"b7681632-31dc-4278-889a-8b89ddccac74\") " pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.106191 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtrmc\" (UniqueName: \"kubernetes.io/projected/f28004e7-4e00-4ffc-ae3a-cfad4022387a-kube-api-access-wtrmc\") pod \"ovn-operator-controller-manager-84c745747f-f6wpw\" (UID: \"f28004e7-4e00-4ffc-ae3a-cfad4022387a\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.120446 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.122205 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.124699 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-q92d7"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.136187 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.189224 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r824c\" (UniqueName: \"kubernetes.io/projected/3f17e8a8-3251-4a03-8f8d-70698d5146b3-kube-api-access-r824c\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.189331 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpcss\" (UniqueName: \"kubernetes.io/projected/61fd19a3-0b06-401a-9bf3-9d2a34bbf291-kube-api-access-wpcss\") pod \"placement-operator-controller-manager-598c4c8547-pvvml\" (UID: \"61fd19a3-0b06-401a-9bf3-9d2a34bbf291\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.189361 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr6v2\" (UniqueName: \"kubernetes.io/projected/1549da87-3151-467a-92d6-de0709a3a6a7-kube-api-access-zr6v2\") pod \"telemetry-operator-controller-manager-cb66d6b59-2djx2\" (UID: \"1549da87-3151-467a-92d6-de0709a3a6a7\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.189389 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swjd8\" (UniqueName: \"kubernetes.io/projected/a19e1891-5907-42dc-9894-6c9b3bcb5cce-kube-api-access-swjd8\") pod \"swift-operator-controller-manager-689b4f76c9-8668d\" (UID: \"a19e1891-5907-42dc-9894-6c9b3bcb5cce\") " pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.189460 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngkv6\" (UniqueName: \"kubernetes.io/projected/f7389885-bb97-4d1d-8c55-9d3f76eda8ed-kube-api-access-ngkv6\") pod \"test-operator-controller-manager-cbdf6dc66-r9twl\" (UID: \"f7389885-bb97-4d1d-8c55-9d3f76eda8ed\") " pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.189490 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f17e8a8-3251-4a03-8f8d-70698d5146b3-cert\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"
Oct 01 12:51:27 crc kubenswrapper[4913]: E1001 12:51:27.189638 4913 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 01 12:51:27 crc kubenswrapper[4913]: E1001 12:51:27.189693 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f17e8a8-3251-4a03-8f8d-70698d5146b3-cert podName:3f17e8a8-3251-4a03-8f8d-70698d5146b3 nodeName:}" failed. No retries permitted until 2025-10-01 12:51:27.68967401 +0000 UTC m=+819.593149588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f17e8a8-3251-4a03-8f8d-70698d5146b3-cert") pod "openstack-baremetal-operator-controller-manager-659bb84579hvlsl" (UID: "3f17e8a8-3251-4a03-8f8d-70698d5146b3") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.198746 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.213468 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.213562 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.214639 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.217810 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7vhvb"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.218572 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.226002 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swjd8\" (UniqueName: \"kubernetes.io/projected/a19e1891-5907-42dc-9894-6c9b3bcb5cce-kube-api-access-swjd8\") pod \"swift-operator-controller-manager-689b4f76c9-8668d\" (UID: \"a19e1891-5907-42dc-9894-6c9b3bcb5cce\") " pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.229397 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.230216 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpcss\" (UniqueName: \"kubernetes.io/projected/61fd19a3-0b06-401a-9bf3-9d2a34bbf291-kube-api-access-wpcss\") pod \"placement-operator-controller-manager-598c4c8547-pvvml\" (UID: \"61fd19a3-0b06-401a-9bf3-9d2a34bbf291\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.234884 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr6v2\" (UniqueName: \"kubernetes.io/projected/1549da87-3151-467a-92d6-de0709a3a6a7-kube-api-access-zr6v2\") pod \"telemetry-operator-controller-manager-cb66d6b59-2djx2\" (UID: \"1549da87-3151-467a-92d6-de0709a3a6a7\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.237463 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r824c\" (UniqueName: \"kubernetes.io/projected/3f17e8a8-3251-4a03-8f8d-70698d5146b3-kube-api-access-r824c\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.238806 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.267677 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.276328 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.290712 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4ns6\" (UniqueName: \"kubernetes.io/projected/e0ef9126-ad52-4803-b11d-f8b1712c4efd-kube-api-access-r4ns6\") pod \"watcher-operator-controller-manager-68d7bc5569-4g45w\" (UID: \"e0ef9126-ad52-4803-b11d-f8b1712c4efd\") " pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.290771 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkv6\" (UniqueName: \"kubernetes.io/projected/f7389885-bb97-4d1d-8c55-9d3f76eda8ed-kube-api-access-ngkv6\") pod \"test-operator-controller-manager-cbdf6dc66-r9twl\" (UID: \"f7389885-bb97-4d1d-8c55-9d3f76eda8ed\") " pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.320899 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkv6\" (UniqueName: \"kubernetes.io/projected/f7389885-bb97-4d1d-8c55-9d3f76eda8ed-kube-api-access-ngkv6\") pod \"test-operator-controller-manager-cbdf6dc66-r9twl\" (UID: \"f7389885-bb97-4d1d-8c55-9d3f76eda8ed\") " pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.333442 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.349099 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.360649 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.361024 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.362075 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.364934 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wnt2c"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.369356 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.382885 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.383388 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.384172 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.384727 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.386130 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-r64gq"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.389877 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl"]
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.391909 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.391977 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzn4z\" (UniqueName: \"kubernetes.io/projected/ad1d9777-734a-412f-a917-8d6b497dcb32-kube-api-access-zzn4z\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.392003 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4ns6\" (UniqueName: \"kubernetes.io/projected/e0ef9126-ad52-4803-b11d-f8b1712c4efd-kube-api-access-r4ns6\") pod \"watcher-operator-controller-manager-68d7bc5569-4g45w\" (UID: \"e0ef9126-ad52-4803-b11d-f8b1712c4efd\") " pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w"
Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.392058 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhx8\" (UniqueName: \"kubernetes.io/projected/b0d4dbf5-aa13-46ec-ab24-5c43a0be638c-kube-api-access-6qhx8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5\" (UID: \"b0d4dbf5-aa13-46ec-ab24-5c43a0be638c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5"
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.407708 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4ns6\" (UniqueName: \"kubernetes.io/projected/e0ef9126-ad52-4803-b11d-f8b1712c4efd-kube-api-access-r4ns6\") pod \"watcher-operator-controller-manager-68d7bc5569-4g45w\" (UID: \"e0ef9126-ad52-4803-b11d-f8b1712c4efd\") " pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.446625 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.493218 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.493300 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzn4z\" (UniqueName: \"kubernetes.io/projected/ad1d9777-734a-412f-a917-8d6b497dcb32-kube-api-access-zzn4z\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.493360 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhx8\" (UniqueName: \"kubernetes.io/projected/b0d4dbf5-aa13-46ec-ab24-5c43a0be638c-kube-api-access-6qhx8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5\" (UID: \"b0d4dbf5-aa13-46ec-ab24-5c43a0be638c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" Oct 01 12:51:27 crc kubenswrapper[4913]: E1001 12:51:27.493382 4913 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 12:51:27 crc kubenswrapper[4913]: E1001 12:51:27.493445 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert podName:ad1d9777-734a-412f-a917-8d6b497dcb32 nodeName:}" failed. No retries permitted until 2025-10-01 12:51:27.993427655 +0000 UTC m=+819.896903233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert") pod "openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" (UID: "ad1d9777-734a-412f-a917-8d6b497dcb32") : secret "webhook-server-cert" not found Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.502425 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b"] Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.510024 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzn4z\" (UniqueName: \"kubernetes.io/projected/ad1d9777-734a-412f-a917-8d6b497dcb32-kube-api-access-zzn4z\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.516316 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhx8\" (UniqueName: \"kubernetes.io/projected/b0d4dbf5-aa13-46ec-ab24-5c43a0be638c-kube-api-access-6qhx8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5\" (UID: \"b0d4dbf5-aa13-46ec-ab24-5c43a0be638c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.542748 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" Oct 01 12:51:27 crc kubenswrapper[4913]: W1001 12:51:27.548614 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086685ca_996d_44d9_bd02_33cc99e5dab9.slice/crio-bf063468f3a21b64b80c729abb9d9bd64a9576e2ac1506bc7bd8dabe4608a5c7 WatchSource:0}: Error finding container bf063468f3a21b64b80c729abb9d9bd64a9576e2ac1506bc7bd8dabe4608a5c7: Status 404 returned error can't find the container with id bf063468f3a21b64b80c729abb9d9bd64a9576e2ac1506bc7bd8dabe4608a5c7 Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.624936 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp"] Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.628012 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc"] Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.695328 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f17e8a8-3251-4a03-8f8d-70698d5146b3-cert\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" Oct 01 12:51:27 crc kubenswrapper[4913]: W1001 12:51:27.695503 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b4ec90_a547_48a1_83ac_3528c53f90f0.slice/crio-a724bd321b8cc4468d6c0536960104d813b9adecdff53464abee030e26f56d47 WatchSource:0}: Error finding container a724bd321b8cc4468d6c0536960104d813b9adecdff53464abee030e26f56d47: Status 404 returned error can't find the container with id a724bd321b8cc4468d6c0536960104d813b9adecdff53464abee030e26f56d47 Oct 01 12:51:27 crc 
kubenswrapper[4913]: I1001 12:51:27.701739 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f17e8a8-3251-4a03-8f8d-70698d5146b3-cert\") pod \"openstack-baremetal-operator-controller-manager-659bb84579hvlsl\" (UID: \"3f17e8a8-3251-4a03-8f8d-70698d5146b3\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" Oct 01 12:51:27 crc kubenswrapper[4913]: W1001 12:51:27.707373 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb312f93_72ce_4d3e_9d89_66526e40dca2.slice/crio-4e6a809a111f235d15847450a864319f5e5de9311c31191b577c495de29fac4b WatchSource:0}: Error finding container 4e6a809a111f235d15847450a864319f5e5de9311c31191b577c495de29fac4b: Status 404 returned error can't find the container with id 4e6a809a111f235d15847450a864319f5e5de9311c31191b577c495de29fac4b Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.745936 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.746927 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" event={"ID":"d7b4ec90-a547-48a1-83ac-3528c53f90f0","Type":"ContainerStarted","Data":"a724bd321b8cc4468d6c0536960104d813b9adecdff53464abee030e26f56d47"} Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.751416 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" event={"ID":"086685ca-996d-44d9-bd02-33cc99e5dab9","Type":"ContainerStarted","Data":"bf063468f3a21b64b80c729abb9d9bd64a9576e2ac1506bc7bd8dabe4608a5c7"} Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.755334 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" event={"ID":"9d418cfb-ec47-4ba7-b29b-5e68fddf11e4","Type":"ContainerStarted","Data":"ee5bf6ebaa32e012266c56540a710d350a36165ae36544fee9139d82531d1789"} Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.812488 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf"] Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.826743 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx"] Oct 01 12:51:27 crc kubenswrapper[4913]: W1001 12:51:27.901434 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac2ed71_cb90_4003_b8f9_5ad6748c08d5.slice/crio-69231c062a751e22af46f4d32f6c70f913e526a4c622328420cd4b92744e7279 WatchSource:0}: Error finding container 69231c062a751e22af46f4d32f6c70f913e526a4c622328420cd4b92744e7279: Status 404 returned error can't find the container with id 69231c062a751e22af46f4d32f6c70f913e526a4c622328420cd4b92744e7279 Oct 01 12:51:27 crc kubenswrapper[4913]: I1001 12:51:27.902199 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.006135 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.006386 4913 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.006453 4913 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert podName:ad1d9777-734a-412f-a917-8d6b497dcb32 nodeName:}" failed. No retries permitted until 2025-10-01 12:51:29.006434179 +0000 UTC m=+820.909909767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert") pod "openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" (UID: "ad1d9777-734a-412f-a917-8d6b497dcb32") : secret "webhook-server-cert" not found Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.007585 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5"] Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.016654 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b36494_1091_44b0_b303_ac62e5cef841.slice/crio-f37fe76e0bf1cf2dd75516010347fc8fd65dbe729ded668c8728d2f6be18512e WatchSource:0}: Error finding container f37fe76e0bf1cf2dd75516010347fc8fd65dbe729ded668c8728d2f6be18512e: Status 404 returned error can't find the container with id f37fe76e0bf1cf2dd75516010347fc8fd65dbe729ded668c8728d2f6be18512e Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.084263 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8"] Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.089032 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e74825_5720_4c49_97e9_a0fccf649b50.slice/crio-70a003cedcd37855f29f8aae8482bc2dddd28f549fc4610429194b3eff72efad WatchSource:0}: Error finding container 70a003cedcd37855f29f8aae8482bc2dddd28f549fc4610429194b3eff72efad: Status 404 returned error can't find the container with id 70a003cedcd37855f29f8aae8482bc2dddd28f549fc4610429194b3eff72efad Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.091777 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.098431 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm"] Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.101081 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ec32c8_323f_4c74_bf82_4dc2a70db41a.slice/crio-f001996f38a5058248079bf2b3078bfe14a734e0a7772957fb6d38ce7d1239a6 
WatchSource:0}: Error finding container f001996f38a5058248079bf2b3078bfe14a734e0a7772957fb6d38ce7d1239a6: Status 404 returned error can't find the container with id f001996f38a5058248079bf2b3078bfe14a734e0a7772957fb6d38ce7d1239a6 Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.564514 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2"] Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.579540 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1549da87_3151_467a_92d6_de0709a3a6a7.slice/crio-69cbb47bfe49e0a78fc8caa69f2caa66224a254587b8838385b5984c422617ff WatchSource:0}: Error finding container 69cbb47bfe49e0a78fc8caa69f2caa66224a254587b8838385b5984c422617ff: Status 404 returned error can't find the container with id 69cbb47bfe49e0a78fc8caa69f2caa66224a254587b8838385b5984c422617ff Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.593466 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.607779 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.618129 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.623440 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.626368 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.636864 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.645971 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.649581 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.653049 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.657916 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w"] Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.676459 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5"] Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.730078 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf28004e7_4e00_4ffc_ae3a_cfad4022387a.slice/crio-37d8ae6cbf54ccca973d817607cb1489246f29701d0dc05de41eb989ef1c0dd4 WatchSource:0}: Error finding container 
Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.747486 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7389885_bb97_4d1d_8c55_9d3f76eda8ed.slice/crio-ec139ab227cf61737a69428bede1db42c223edbfe4fb34470a4b721e62a07c40 WatchSource:0}: Error finding container ec139ab227cf61737a69428bede1db42c223edbfe4fb34470a4b721e62a07c40: Status 404 returned error can't find the container with id ec139ab227cf61737a69428bede1db42c223edbfe4fb34470a4b721e62a07c40
Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.748723 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61fd19a3_0b06_401a_9bf3_9d2a34bbf291.slice/crio-3a6ffef8101b7e5373a09c41a9d88b332019cd18289eeae2bd0f68077429b786 WatchSource:0}: Error finding container 3a6ffef8101b7e5373a09c41a9d88b332019cd18289eeae2bd0f68077429b786: Status 404 returned error can't find the container with id 3a6ffef8101b7e5373a09c41a9d88b332019cd18289eeae2bd0f68077429b786
Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.750199 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7681632_31dc_4278_889a_8b89ddccac74.slice/crio-cd8b277b0af004891eb93f36619fbaffbdc4a500e4cef52e13f4a3ef6fd0e7ec WatchSource:0}: Error finding container cd8b277b0af004891eb93f36619fbaffbdc4a500e4cef52e13f4a3ef6fd0e7ec: Status 404 returned error can't find the container with id cd8b277b0af004891eb93f36619fbaffbdc4a500e4cef52e13f4a3ef6fd0e7ec
Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.784554 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rdxzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7fd5b6bbc6-45w5m_openstack-operators(a34a6147-f982-48cf-9976-b39c4dd420cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.784623 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l9r42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54fbbfcd44-f8tw7_openstack-operators(e80ad896-dd86-43e6-850b-f12088a61cf5): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.787320 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" event={"ID":"e5ec32c8-323f-4c74-bf82-4dc2a70db41a","Type":"ContainerStarted","Data":"f001996f38a5058248079bf2b3078bfe14a734e0a7772957fb6d38ce7d1239a6"}
Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.787636 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r4ns6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-68d7bc5569-4g45w_openstack-operators(e0ef9126-ad52-4803-b11d-f8b1712c4efd): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.789209 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" event={"ID":"13e74825-5720-4c49-97e9-a0fccf649b50","Type":"ContainerStarted","Data":"70a003cedcd37855f29f8aae8482bc2dddd28f549fc4610429194b3eff72efad"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.790748 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" event={"ID":"f28004e7-4e00-4ffc-ae3a-cfad4022387a","Type":"ContainerStarted","Data":"37d8ae6cbf54ccca973d817607cb1489246f29701d0dc05de41eb989ef1c0dd4"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.793150 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2" event={"ID":"1549da87-3151-467a-92d6-de0709a3a6a7","Type":"ContainerStarted","Data":"69cbb47bfe49e0a78fc8caa69f2caa66224a254587b8838385b5984c422617ff"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.794902 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" event={"ID":"8f9b91c6-b4e9-44b3-83c2-42412d48de96","Type":"ContainerStarted","Data":"fd4a62929fad30aa99c7e33e741ad70809a243139ed0951ca21cd4e04ba2813a"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.796824 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" event={"ID":"5c79503f-3029-498b-b9b1-e2df43820cb2","Type":"ContainerStarted","Data":"d96374476d3dcf872762d6dbc7f84df03d12ea0b59342c866d9b710084054d99"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.798793 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" event={"ID":"845f41f1-b619-4984-aa2a-bae46992d463","Type":"ContainerStarted","Data":"1aa9f251248ee3268eb1f5f24d0f09da0515e14ac42bb60605aaafad36bb59f9"}
Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.799081 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d4dbf5_aa13_46ec_ab24_5c43a0be638c.slice/crio-bedddfb1fd54bf6b7dd9a159400a79cb1c590215e1b6e511a9a91753ee076194 WatchSource:0}: Error finding container bedddfb1fd54bf6b7dd9a159400a79cb1c590215e1b6e511a9a91753ee076194: Status 404 returned error can't find the container with id bedddfb1fd54bf6b7dd9a159400a79cb1c590215e1b6e511a9a91753ee076194
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.801054 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml" event={"ID":"61fd19a3-0b06-401a-9bf3-9d2a34bbf291","Type":"ContainerStarted","Data":"3a6ffef8101b7e5373a09c41a9d88b332019cd18289eeae2bd0f68077429b786"}
Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.801963 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19e1891_5907_42dc_9894_6c9b3bcb5cce.slice/crio-12fcc760f3fd7a3d6047a0bd315004b2afb7c1e62a499ae4698d20d20ebb2c4e WatchSource:0}: Error finding container 12fcc760f3fd7a3d6047a0bd315004b2afb7c1e62a499ae4698d20d20ebb2c4e: Status 404 returned error can't find the container with id 12fcc760f3fd7a3d6047a0bd315004b2afb7c1e62a499ae4698d20d20ebb2c4e
Oct 01 12:51:28 crc kubenswrapper[4913]: W1001 12:51:28.802322 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f17e8a8_3251_4a03_8f8d_70698d5146b3.slice/crio-bf5e339ae79f3d3e6e27a946da9db0076f09caa0f20036b8b57eff99fc545406 WatchSource:0}: Error finding container bf5e339ae79f3d3e6e27a946da9db0076f09caa0f20036b8b57eff99fc545406: Status 404 returned error can't find the container with id bf5e339ae79f3d3e6e27a946da9db0076f09caa0f20036b8b57eff99fc545406
Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.803098 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6qhx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5_openstack-operators(b0d4dbf5-aa13-46ec-ab24-5c43a0be638c): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.804207 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" podUID="b0d4dbf5-aa13-46ec-ab24-5c43a0be638c"
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.824590 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" event={"ID":"b7681632-31dc-4278-889a-8b89ddccac74","Type":"ContainerStarted","Data":"cd8b277b0af004891eb93f36619fbaffbdc4a500e4cef52e13f4a3ef6fd0e7ec"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.824629 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" event={"ID":"cb312f93-72ce-4d3e-9d89-66526e40dca2","Type":"ContainerStarted","Data":"4e6a809a111f235d15847450a864319f5e5de9311c31191b577c495de29fac4b"}
Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.826918 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" event={"ID":"55b36494-1091-44b0-b303-ac62e5cef841","Type":"ContainerStarted","Data":"f37fe76e0bf1cf2dd75516010347fc8fd65dbe729ded668c8728d2f6be18512e"}
Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.829373 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swjd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-689b4f76c9-8668d_openstack-operators(a19e1891-5907-42dc-9894-6c9b3bcb5cce): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swjd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-689b4f76c9-8668d_openstack-operators(a19e1891-5907-42dc-9894-6c9b3bcb5cce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.830407 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" event={"ID":"f7389885-bb97-4d1d-8c55-9d3f76eda8ed","Type":"ContainerStarted","Data":"ec139ab227cf61737a69428bede1db42c223edbfe4fb34470a4b721e62a07c40"} Oct 01 12:51:28 crc kubenswrapper[4913]: E1001 12:51:28.830526 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:9739588b6480acdeada79842182c7e8507dc4f3669be8330591460ffd44cdcec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:9774f19d7a63d6f516afa701fb5f031674ad537e595049bbc57817356c7642fe,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:29c8cd4f2d853f512e2ecd44f522f28c3aac046a72733365aa5e91667041d62e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:ed896681f0d9720f56bbcb0b7a4f3626ed397e89af919604ca68b42b7b598859,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:712e1c932a90ef5e3c3ee5d5aea591a377da8c4af604ebd8ec399869a61dfbef,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:10fd8489a5bf6f1d781e9226de68356132db78b62269e69d632748cb08fae725,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:8b3a90516ba0695cf3198a7b101da770c30c8100cb79f8088b5729e6a50ddd6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:6d42bcf65422d2de9cd807feb3e8b005de10084b4b8eb340c8a9045644ae7aaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:b19043eac7c653e00da8da9418ae378fdd29698adb1adb4bf5ae7cfc03ba5538,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:c486e00b36ea7698d6a4cd9048a759bad5a8286e4949bbd1f82c3ddb70600b9b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ef2727f0300fbf3bf15d8ddc409d0fd63e4aac9dd64c86459bd6ff64fc6b9534,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha2
56:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:329aac65ba00c3cf43bb1d5fac8818752f01de90b47719e2a84db4e2fe083292,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:6ce73885ac1ee7c69468efc448eff5deae46502812c5e3d099f771e1cc03345f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:282cc0fcdbb8a688dd62a2499480aae4a36b620f2160d51e6c8269e6cc32d5fc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:d98c0c9d3bdd84daf4b98d45b8bbe2e67a633491897dda7167664a5fa1f0f26e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:4ad1d36fe1c8992e43910fc2d566b991fd73f9b82b1ab860c66858448ff82c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:92789eab1b8a91807a5e898cb63478d125ae539eafe63c96049100c6ddeadb04,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:ee9832268e0df5d62c50c5ce171e9ef72a035aa74c718cfbf482e34426d8d15e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:07b4f96f24f32224c13613f85173f9fcc3092b8797ffa47519403d124bfe4c15,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:3a873c95bcb7ae8bd24ff1eb5fe89ac5272a41a3345a7b41d55419b5d66b70e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:388dbae2f1aae2720e919cc24d10cd577b73b4e4ef7abdc34287bcb8d27ff98f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:d4c1b2496868da3dcca9f4bda0834fcc58d23c21d8ce3c42a68205d02039c487,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:c4414cc2680fb1bacbf99261f759f4ef7401fb2e4953140270bffdab8e002f22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:b9b950a656f1456b3143872c492b0987bf4a9e23bc7c59d843cf50099667b368,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:afd5d6822b86ea0930b2011fede834bb24495995d7baac03363ab61d89f07a22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-neutron-dhcp-agent@sha256:665d7a25dfc959ec5448d5ba6b430792ebde1be1580ea6809e9b3b4f94184b3f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:499c6d82390ee2dbb91628d2e42671406372fb603d697685a04145cf6dd8d0ab,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:da2736bc98bfe340e86234523d4c00220f6f79add271900981cf4ad9f4c5ee51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:4df8dad8a5fb4805a0424cbc0b8df666b9a06b76c64f26e186f3b9e8efe6cd95,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:65c16453b5b7bb113646ffce0be26138e89eecbf6dd1582cdfe76af7f5dc62cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ce968dce2209ec5114772b4b73ed16c0a25988637372f2afbfac080cc6f1e378,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:b7823eaacf55280cdf3f1bede4f40bf49fdbf9ba9f3f5ba64b0abedede601c8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:605206d967ffaa20156eb07a645654cd3e0f880bb0eefbb2b5e1e749b169f148,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:9470db6caf5102cf37ddb1f137f17b05ef7119f174f4189beb4839ef7f65730c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:34e84da4ae7e5d65931cbefcda84fd8fdc93271ec466adf1a9040b67a3af176a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:b301b17c31e47733a8a232773427ce3cb50433a3aa09d4a5bd998b1aeb5e5530,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:d642c35c0f9d3acf31987c028f1d4d4fdf7b49e1d6cbcd73268c12b3d6e14b86,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:922eb0799ab36a91aa95abe52565dc60db807457dbf8c651b30e06b9e8aebcd4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:cd01e9605ab513458a6813e38d37fbfde1a91388cc5c00962203dbcbdc285e79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-ante
lope-centos9/openstack-ironic-neutron-agent@sha256:dd35c22b17730cbca8547ea98459f182939462c8dc3465d21335a377018937de,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:0e0e2e48a41d5417f1d6a4407e63d443611b7eacd66e27f561c9eedf3e5a66c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:735bd24219fdb5f21c31313a5bc685364f45c004fb5e8af634984c147060d4e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:35b5554efae34f2c25a2d274c78bdaecf3d4ce949fa61c692835ee54cdfc6d74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:01b93ab0d87482b9a1fd46706771974743dea1ca74f5fcc3de4a560f7cfc033b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:87471fbe3ba77b7115096f4fef8f5a9e1468cbd5bf6060c09785a60f9107a717,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:947dcc46173064939cba252d5db34eb6ddd05eb0af7afd762beebe77e9a72c6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:8498ed720d02ce4e7045f7eb0051b138274cddba9b1e443d11e413da3474d3a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:2cb054830655a6af5fc6848360618676d24fd9cf15078c0b9855e09d05733eec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:0f5f8f560cd3b4951f7e8e67ef570575435b4c6915658cbb66f32a201776078b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:7055e8d7b7d72ce697c6077be14c525c019d186002f04765b90a14c82e01cc7c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:d2cd7a21461b4b569d93a63d57761f437cf6bd0847d69a3a65f64d400c7cca6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:432c0c6f36a5e4e4db394771f7dc72f3bf9e5060dc4220f781d3c5050cc17f0d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:3ff379a74cc15352bfa25605dbb1a5f4250620e8364bf87ed2f3d5c17e6a8b26,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:c67a7bba2fc9351c302369b590473a737bab20d0982d227756fe1fa0bc1c8773,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-
health-manager@sha256:50c613d159667a26ba4bfb7aebf157b8db8919c815a866438b1d2700231a508e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:f3d3d7a7c83926a09714199406bfe8070e6be5055cbfbf00aa37f47e1e5e9bc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:e9b3260907b0e417bb779a7d513a2639734cbbf792e77c61e05e760d06978f4a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:1aa6a76e67f2d91ee45472741238b5d4ab53f9bcb94db678c7ae92e1af28899d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:80b8547cf5821a4eb5461d1ac14edbc700ef03926268af960bf511647de027af,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content@sha256:5b82cdbfa30e915f97ab6f3726b60582c7b62a819e4aa4e87cf42fc7495b4ef9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:bf42dfd2e225818662aa28c4bb23204dc47b2b91127ca0e49b085baa1ea7609d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:bd08ffdb4dcfd436200d846d15b2bdcc14122fa43adfea4c0980a087a18f9e3e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:2d1e733d24df6ca02636374147f801a0ec1509f8db2f9ad8c739b3f2341815fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:c08ba2a0df4cc18e615b25c329e9c74153709b435c032c38502ec78ba297c5fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:b6cdafc7722def5b63ef4f00251e10aca93ef82628b21e88925c3d4b49277316,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:7387b628d7cfb3ff349e0df6f11f41ae7fdb0e2d55844944896af02a81ac7cf7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:b2782fe02b1438d68308a5847b0628f0971b5bb8bb0a4d20fe15176fa75bd33f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-an
telope-centos9/openstack-swift-container@sha256:7118cc3a695fead2a8bab14c8ace018ed7a5ba23ef347bf4ead44219e8467866,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:793a836e17b07b0e0a4e8d3177fd04724e1e058fca275ef434abe60a2e444a79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:713d74dc81859344bdcae68a9f7a954146c3e68cfa819518a58cce9e896298c8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:e39be536015777a1b0df8ac863f354046b2b15fee8482abd37d2fa59d8074208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:6ab460c1ec80799499eae55bb8cad0ac3bd3e501c7abe57b665e58921ca88063,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:f3d12096a1cd68b1aa837f46a42418ba8a11ca2d18dcb63e5c16d15986f28d4c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:700e3619916939c838032c130f0e4a0337a628598ae6a7b752a8e4733bb231e0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r824c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-659bb84579hvlsl_openstack-operators(3f17e8a8-3251-4a03-8f8d-70698d5146b3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:51:28 crc kubenswrapper[4913]: I1001 12:51:28.831854 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
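
Editor's note: the long dump that just closed is a single kubelet error record — when StartContainer fails, the pod worker logs the complete Container spec of the openstack-baremetal-operator manager. Three things in it are worth decoding. The RELATED_IMAGE_*_URL_DEFAULT variables are the usual operator convention for pinning default operand images by digest; LEASE_DURATION, RENEW_DEADLINE and RETRY_PERIOD tune the manager's leader election; and the odd-looking Resources values such as {{500 -3} {} 500m DecimalSI} are the printed internal form of resource.Quantity (500 x 10^-3 cores, i.e. 500m). A short Go sketch, assuming only the k8s.io/apimachinery module, reproducing those quantities:

```go
// Decoding the Resources block from the spec dump above.
// "{{500 -3} {} 500m DecimalSI}" is Quantity's internal form: 500 * 10^-3
// cores; the memory limit is 536870912 bytes, i.e. 512Mi in BinarySI form.
package main

import (
	"fmt"

	"k8s.io/apimachinery/pkg/api/resource"
)

func main() {
	cpu := resource.MustParse("500m")  // limits.cpu in the dump
	mem := resource.MustParse("512Mi") // limits.memory: 536870912 bytes

	fmt.Println(cpu.MilliValue(), "millicores") // 500
	fmt.Println(mem.Value(), "bytes")           // 536870912
}
```
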
pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" event={"ID":"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5","Type":"ContainerStarted","Data":"69231c062a751e22af46f4d32f6c70f913e526a4c622328420cd4b92744e7279"} Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.017161 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" podUID="a19e1891-5907-42dc-9894-6c9b3bcb5cce" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.023997 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.054801 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad1d9777-734a-412f-a917-8d6b497dcb32-cert\") pod \"openstack-operator-controller-manager-6c7b6bcb7c-dvhfd\" (UID: \"ad1d9777-734a-412f-a917-8d6b497dcb32\") " pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.083737 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" podUID="e80ad896-dd86-43e6-850b-f12088a61cf5" Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.094362 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" podUID="e0ef9126-ad52-4803-b11d-f8b1712c4efd" Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.155974 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" podUID="a34a6147-f982-48cf-9976-b39c4dd420cf" Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.167573 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" podUID="3f17e8a8-3251-4a03-8f8d-70698d5146b3" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.224762 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.819486 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd"] Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.855695 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" event={"ID":"3f17e8a8-3251-4a03-8f8d-70698d5146b3","Type":"ContainerStarted","Data":"c839a7bedeb61f1d43c2da0c53dbe366a73287f84ab67da18d66ed59ac685e4a"} Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.855738 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" event={"ID":"3f17e8a8-3251-4a03-8f8d-70698d5146b3","Type":"ContainerStarted","Data":"bf5e339ae79f3d3e6e27a946da9db0076f09caa0f20036b8b57eff99fc545406"} Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.864405 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" podUID="3f17e8a8-3251-4a03-8f8d-70698d5146b3" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.865358 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" event={"ID":"a34a6147-f982-48cf-9976-b39c4dd420cf","Type":"ContainerStarted","Data":"bdf166c6b23aebfad179c7d730658b4db9f23e31570dd9636123b5dee37918eb"} Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.865389 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" event={"ID":"a34a6147-f982-48cf-9976-b39c4dd420cf","Type":"ContainerStarted","Data":"2b04b00e02790a32478aaf28dd884c4643e51688297007fb88eec997a94a65de"} Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.866582 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" podUID="a34a6147-f982-48cf-9976-b39c4dd420cf" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.884463 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" event={"ID":"b0d4dbf5-aa13-46ec-ab24-5c43a0be638c","Type":"ContainerStarted","Data":"bedddfb1fd54bf6b7dd9a159400a79cb1c590215e1b6e511a9a91753ee076194"} Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.888626 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" podUID="b0d4dbf5-aa13-46ec-ab24-5c43a0be638c" Oct 01 12:51:29 crc kubenswrapper[4913]: 
I1001 12:51:29.895302 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" event={"ID":"e0ef9126-ad52-4803-b11d-f8b1712c4efd","Type":"ContainerStarted","Data":"a3d0765479886ed9d077565d79499e71d12de84d8a2bdb12f64946fec329d3d6"} Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.895348 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" event={"ID":"e0ef9126-ad52-4803-b11d-f8b1712c4efd","Type":"ContainerStarted","Data":"b05ebc022d71dcfeae01d9d7f069260e683d6cfc47af0f5c5f260b50eaaf381b"} Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.901433 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" event={"ID":"e80ad896-dd86-43e6-850b-f12088a61cf5","Type":"ContainerStarted","Data":"986dcd15bea41ce77c1d42ed14583160f9ae9c5acaad1a8480ffc3b398e3be18"} Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.901482 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" event={"ID":"e80ad896-dd86-43e6-850b-f12088a61cf5","Type":"ContainerStarted","Data":"7b29362b50d99564ed07bc6b7195ae27a069f8d3e81dd6d45dd6d5d59d7eb08a"} Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.903793 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" podUID="e0ef9126-ad52-4803-b11d-f8b1712c4efd" Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.903909 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" podUID="e80ad896-dd86-43e6-850b-f12088a61cf5" Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.905110 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" event={"ID":"a19e1891-5907-42dc-9894-6c9b3bcb5cce","Type":"ContainerStarted","Data":"cf1edc2e30a9ac864361ce9276924b09362463847dd0e1a0b2f31253d04eb7ba"} Oct 01 12:51:29 crc kubenswrapper[4913]: I1001 12:51:29.905130 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" event={"ID":"a19e1891-5907-42dc-9894-6c9b3bcb5cce","Type":"ContainerStarted","Data":"12fcc760f3fd7a3d6047a0bd315004b2afb7c1e62a499ae4698d20d20ebb2c4e"} Oct 01 12:51:29 crc kubenswrapper[4913]: E1001 12:51:29.908219 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" podUID="a19e1891-5907-42dc-9894-6c9b3bcb5cce" Oct 01 12:51:30 crc kubenswrapper[4913]: I1001 12:51:30.942040 4913 kubelet.go:2453] 
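
Editor's note: once a pull has failed, the image goes into kubelet's per-image backoff, so subsequent sync loops report ImagePullBackOff ("Back-off pulling image ...") without contacting the registry at all; the wait roughly doubles from 10s up to a 5-minute cap. An illustrative sketch of that schedule (kubelet builds the real thing with client-go's flowcontrol.Backoff, not a loop like this):

```go
// Approximate ImagePullBackOff schedule: exponential doubling from an
// assumed 10s initial delay, capped at 5 minutes per image.
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("attempt %d: Back-off pulling image, next try in %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```
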
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" event={"ID":"ad1d9777-734a-412f-a917-8d6b497dcb32","Type":"ContainerStarted","Data":"cbb35ccb36b46eb27be7da02f35ef08cf329d83d11d2b22d1b62ab141b214e8c"} Oct 01 12:51:30 crc kubenswrapper[4913]: I1001 12:51:30.942417 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:30 crc kubenswrapper[4913]: I1001 12:51:30.942434 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" event={"ID":"ad1d9777-734a-412f-a917-8d6b497dcb32","Type":"ContainerStarted","Data":"6ba0821dd154be147b89038369f48911fe3162d903148a1f13451c2364c3ce3e"} Oct 01 12:51:30 crc kubenswrapper[4913]: I1001 12:51:30.942447 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" event={"ID":"ad1d9777-734a-412f-a917-8d6b497dcb32","Type":"ContainerStarted","Data":"5832052101b754e47eeb4a9d2eac8ac65ae082bb0c981f2cc917204ab33c3e99"} Oct 01 12:51:30 crc kubenswrapper[4913]: E1001 12:51:30.951228 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" podUID="e0ef9126-ad52-4803-b11d-f8b1712c4efd" Oct 01 12:51:30 crc kubenswrapper[4913]: E1001 12:51:30.951381 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" podUID="b0d4dbf5-aa13-46ec-ab24-5c43a0be638c" Oct 01 12:51:30 crc kubenswrapper[4913]: E1001 12:51:30.951743 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" podUID="3f17e8a8-3251-4a03-8f8d-70698d5146b3" Oct 01 12:51:30 crc kubenswrapper[4913]: E1001 12:51:30.953505 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" podUID="a19e1891-5907-42dc-9894-6c9b3bcb5cce" Oct 01 12:51:30 crc kubenswrapper[4913]: E1001 12:51:30.954555 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" 
podUID="a34a6147-f982-48cf-9976-b39c4dd420cf" Oct 01 12:51:30 crc kubenswrapper[4913]: E1001 12:51:30.974150 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" podUID="e80ad896-dd86-43e6-850b-f12088a61cf5" Oct 01 12:51:31 crc kubenswrapper[4913]: I1001 12:51:31.004192 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" podStartSLOduration=4.0041707 podStartE2EDuration="4.0041707s" podCreationTimestamp="2025-10-01 12:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:51:31.000456938 +0000 UTC m=+822.903932516" watchObservedRunningTime="2025-10-01 12:51:31.0041707 +0000 UTC m=+822.907646278" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.168906 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l945r"] Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.170493 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.180065 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l945r"] Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.336875 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vj6b\" (UniqueName: \"kubernetes.io/projected/5dddf17f-397f-42c9-a202-94d68fe00c52-kube-api-access-8vj6b\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.337179 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-catalog-content\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.337474 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-utilities\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.438396 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vj6b\" (UniqueName: \"kubernetes.io/projected/5dddf17f-397f-42c9-a202-94d68fe00c52-kube-api-access-8vj6b\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.438441 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-catalog-content\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.438483 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-utilities\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.439568 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-utilities\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.439877 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-catalog-content\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.460263 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vj6b\" (UniqueName: \"kubernetes.io/projected/5dddf17f-397f-42c9-a202-94d68fe00c52-kube-api-access-8vj6b\") pod \"community-operators-l945r\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:35 crc kubenswrapper[4913]: I1001 12:51:35.491020 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:39 crc kubenswrapper[4913]: I1001 12:51:39.231743 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6c7b6bcb7c-dvhfd" Oct 01 12:51:39 crc kubenswrapper[4913]: I1001 12:51:39.620803 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l945r"] Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.024150 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" event={"ID":"845f41f1-b619-4984-aa2a-bae46992d463","Type":"ContainerStarted","Data":"41884b09d183954095d4c00a3f832296011c896c7f95054c9010acf724e4451f"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.025536 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" event={"ID":"f28004e7-4e00-4ffc-ae3a-cfad4022387a","Type":"ContainerStarted","Data":"725cbc9441cb19887f5d3b9dbfe8667b5fc976bd17dd6d4a84ecdc3d26aed5f5"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.030195 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" event={"ID":"cb312f93-72ce-4d3e-9d89-66526e40dca2","Type":"ContainerStarted","Data":"ccd2b5d03b726813d8be21299993850ac5f65b51d1eb5b6480e10bd5f648feea"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.031522 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" event={"ID":"13e74825-5720-4c49-97e9-a0fccf649b50","Type":"ContainerStarted","Data":"ef907266bb31dd404166e1c56d6425ea4a5bfa0a0638c14ee3b8a1eb6e987150"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.032702 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" event={"ID":"d7b4ec90-a547-48a1-83ac-3528c53f90f0","Type":"ContainerStarted","Data":"72e154474838b573dcbc8f4278e625561587a429d723d1f2eaca6645eadd9378"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.033653 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" event={"ID":"5c79503f-3029-498b-b9b1-e2df43820cb2","Type":"ContainerStarted","Data":"8141c79743d20c3c52d027e160b38a15078b8774c791f014f50ba17e2e729f55"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.034610 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml" event={"ID":"61fd19a3-0b06-401a-9bf3-9d2a34bbf291","Type":"ContainerStarted","Data":"928cfc8553c5033c7d02c695ed0a8b5c96ed3e883e989ead3548dc50dad48b33"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.035686 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" event={"ID":"086685ca-996d-44d9-bd02-33cc99e5dab9","Type":"ContainerStarted","Data":"308eeddab19e5cb7f5936539b163593b176c7e6ed4d914dc428bcd7e4ed565d0"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.043737 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" 
event={"ID":"55b36494-1091-44b0-b303-ac62e5cef841","Type":"ContainerStarted","Data":"0191c172e73a50a4bea8bca40d860c9aafca3d6e7e142cb553e6c23f8367c9f6"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.061278 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" event={"ID":"f7389885-bb97-4d1d-8c55-9d3f76eda8ed","Type":"ContainerStarted","Data":"9454bf795bfa4c31c09f5e818ff5a1389442ce13ca104ee344c25379b216f6c0"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.066556 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" event={"ID":"e5ec32c8-323f-4c74-bf82-4dc2a70db41a","Type":"ContainerStarted","Data":"024403f5449c05d038b54fe9f142218c6002f6780e8677937d4764c75d2c9ec6"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.067551 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerStarted","Data":"618aee65f361fb6093d4b89773dde77cf89fbd471d3f1cda731410fc4f3e4d87"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.068547 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" event={"ID":"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5","Type":"ContainerStarted","Data":"ee161e557d626692f56e01a7b4426b3259c51aed92e3e12520b7e62b7f912141"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.069456 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" event={"ID":"8f9b91c6-b4e9-44b3-83c2-42412d48de96","Type":"ContainerStarted","Data":"e6d091538e6761f9fee41b1f17e383a62d5894cba0f3299de33f81428d982fa6"} Oct 01 12:51:40 crc kubenswrapper[4913]: I1001 12:51:40.070436 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" event={"ID":"9d418cfb-ec47-4ba7-b29b-5e68fddf11e4","Type":"ContainerStarted","Data":"8da882d12d2d6cecce4d707292235a32e426d7113e585c3e8fae924b64c0dbc4"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.106667 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" event={"ID":"f28004e7-4e00-4ffc-ae3a-cfad4022387a","Type":"ContainerStarted","Data":"5545350899d7efdf61a40563ca18bb0acaefe8e782631c778ca787b876350a84"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.107834 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.109383 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" event={"ID":"b7681632-31dc-4278-889a-8b89ddccac74","Type":"ContainerStarted","Data":"dc739558a4318f99049fb8aa984409f3c8c0efa1a493fa0f9808560ff1595d01"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.109408 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" event={"ID":"b7681632-31dc-4278-889a-8b89ddccac74","Type":"ContainerStarted","Data":"3d54e2b77d5e70913835c1b8a3e4904475ab86b57ee72506d06107f7023bc762"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.109957 4913 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.114292 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" event={"ID":"845f41f1-b619-4984-aa2a-bae46992d463","Type":"ContainerStarted","Data":"3ddbd2996a169243cf529099346efb54bdca1693f7b4adf77ff89fcf095850df"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.114714 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.119583 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" event={"ID":"086685ca-996d-44d9-bd02-33cc99e5dab9","Type":"ContainerStarted","Data":"ecd9f89bbe18a2e1309c9398fc774a768111d35bb94ca608a0c5ce21a254b31e"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.120059 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.136978 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" podStartSLOduration=4.741500748 podStartE2EDuration="15.136965728s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.747783476 +0000 UTC m=+820.651259054" lastFinishedPulling="2025-10-01 12:51:39.143248436 +0000 UTC m=+831.046724034" observedRunningTime="2025-10-01 12:51:41.135279652 +0000 UTC m=+833.038755250" watchObservedRunningTime="2025-10-01 12:51:41.136965728 +0000 UTC m=+833.040441306" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.137350 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" event={"ID":"5ac2ed71-cb90-4003-b8f9-5ad6748c08d5","Type":"ContainerStarted","Data":"00f6b9fcc32c8de216d32c7fbfa0a6ab1a5662b7691dd34f85ecc9eb4212d993"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.137537 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.161998 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" event={"ID":"55b36494-1091-44b0-b303-ac62e5cef841","Type":"ContainerStarted","Data":"6310c84e292ccd5ac32af2f3f77f1fe02df9e307368e33c4e6c98ad30aca6803"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.162278 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.167619 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" event={"ID":"f7389885-bb97-4d1d-8c55-9d3f76eda8ed","Type":"ContainerStarted","Data":"3450c156ecabbf8a8ef9a6a53cfacf89e7ea8f968ea6bef9991416b9a89b1e4d"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.167996 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.169809 4913 generic.go:334] "Generic (PLEG): container finished" podID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerID="e95ad1039bb829453e5b1ead39da6b601c53aeeb79d1ba462092859666acd6e0" exitCode=0 Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.169859 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerDied","Data":"e95ad1039bb829453e5b1ead39da6b601c53aeeb79d1ba462092859666acd6e0"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.179100 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" podStartSLOduration=3.921232642 podStartE2EDuration="15.179083541s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:27.880402099 +0000 UTC m=+819.783877667" lastFinishedPulling="2025-10-01 12:51:39.138252978 +0000 UTC m=+831.041728566" observedRunningTime="2025-10-01 12:51:41.177733884 +0000 UTC m=+833.081209472" watchObservedRunningTime="2025-10-01 12:51:41.179083541 +0000 UTC m=+833.082559119" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.187804 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" event={"ID":"cb312f93-72ce-4d3e-9d89-66526e40dca2","Type":"ContainerStarted","Data":"da7effaa51a619953dd477aa886115a94ea08b34601bc5dcd3cbda3aa43473cb"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.188489 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.190045 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2" event={"ID":"1549da87-3151-467a-92d6-de0709a3a6a7","Type":"ContainerStarted","Data":"da98ad7bc70ee5959d0b519fcc9283b7b9099b58509ae3eb9d7fdf41ed60102d"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.190080 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2" event={"ID":"1549da87-3151-467a-92d6-de0709a3a6a7","Type":"ContainerStarted","Data":"e5fffc608b7b75002e60dfc58f14db6271f6aab035fb4698d547c9a13f17daf2"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.190740 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.191821 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" event={"ID":"8f9b91c6-b4e9-44b3-83c2-42412d48de96","Type":"ContainerStarted","Data":"e5f0fdcb27b39c7659a1dc8715ab6342f19d15b8401ba2da8c3a0b16af04e70b"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.192156 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.203542 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" podStartSLOduration=3.628046658 podStartE2EDuration="15.203526836s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:27.562883973 +0000 UTC m=+819.466359551" lastFinishedPulling="2025-10-01 12:51:39.138364111 +0000 UTC m=+831.041839729" observedRunningTime="2025-10-01 12:51:41.202803096 +0000 UTC m=+833.106278694" watchObservedRunningTime="2025-10-01 12:51:41.203526836 +0000 UTC m=+833.107002404" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.206361 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" event={"ID":"9d418cfb-ec47-4ba7-b29b-5e68fddf11e4","Type":"ContainerStarted","Data":"b077dd231b3d8e7584f22c09a84a53d1821caaf60bd8a330d440396b3b0eeb15"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.207304 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.212887 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml" event={"ID":"61fd19a3-0b06-401a-9bf3-9d2a34bbf291","Type":"ContainerStarted","Data":"00411954d23b515a96b96c70decebf7777fa3e18d745bed3fb7ab6e3fe8342e1"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.213351 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.219727 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" event={"ID":"13e74825-5720-4c49-97e9-a0fccf649b50","Type":"ContainerStarted","Data":"81138dd17764008cdaf7125aceb691379d601750325d4c52ba8d6993fcb84ce0"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.220126 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.221201 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" event={"ID":"d7b4ec90-a547-48a1-83ac-3528c53f90f0","Type":"ContainerStarted","Data":"c7c378a7dc252279b23ce7f6445f2544e13c7c786d0c9947779df413c7d0efb7"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.221424 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.224431 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" event={"ID":"5c79503f-3029-498b-b9b1-e2df43820cb2","Type":"ContainerStarted","Data":"d9c9b0c0dccf027aea4f1783fff41fded40babfd7bd9025f383591fd2490139b"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.225192 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.227375 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" 
event={"ID":"e5ec32c8-323f-4c74-bf82-4dc2a70db41a","Type":"ContainerStarted","Data":"344c54f0bc87673f0616460ad0a09f5b5cf9f8a2a640ff2808923faf345ffe13"} Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.227725 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.235417 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" podStartSLOduration=4.844926844 podStartE2EDuration="15.235404806s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.752464695 +0000 UTC m=+820.655940263" lastFinishedPulling="2025-10-01 12:51:39.142942647 +0000 UTC m=+831.046418225" observedRunningTime="2025-10-01 12:51:41.233436122 +0000 UTC m=+833.136911720" watchObservedRunningTime="2025-10-01 12:51:41.235404806 +0000 UTC m=+833.138880384" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.258772 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" podStartSLOduration=3.86101747 podStartE2EDuration="15.258757651s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:27.74025147 +0000 UTC m=+819.643727048" lastFinishedPulling="2025-10-01 12:51:39.137991641 +0000 UTC m=+831.041467229" observedRunningTime="2025-10-01 12:51:41.257478495 +0000 UTC m=+833.160954073" watchObservedRunningTime="2025-10-01 12:51:41.258757651 +0000 UTC m=+833.162233229" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.278960 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" podStartSLOduration=4.174042813 podStartE2EDuration="15.278943378s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.033223559 +0000 UTC m=+819.936699137" lastFinishedPulling="2025-10-01 12:51:39.138124124 +0000 UTC m=+831.041599702" observedRunningTime="2025-10-01 12:51:41.278595109 +0000 UTC m=+833.182070697" watchObservedRunningTime="2025-10-01 12:51:41.278943378 +0000 UTC m=+833.182418956" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.312255 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" podStartSLOduration=3.652265277 podStartE2EDuration="15.312240978s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:27.477911987 +0000 UTC m=+819.381387565" lastFinishedPulling="2025-10-01 12:51:39.137887698 +0000 UTC m=+831.041363266" observedRunningTime="2025-10-01 12:51:41.307342022 +0000 UTC m=+833.210817620" watchObservedRunningTime="2025-10-01 12:51:41.312240978 +0000 UTC m=+833.215716546" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.339407 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" podStartSLOduration=4.942844498 podStartE2EDuration="15.339392918s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.746820409 +0000 UTC m=+820.650295987" lastFinishedPulling="2025-10-01 12:51:39.143368829 +0000 UTC m=+831.046844407" observedRunningTime="2025-10-01 12:51:41.338649626 +0000 UTC 
m=+833.242125224" watchObservedRunningTime="2025-10-01 12:51:41.339392918 +0000 UTC m=+833.242868496" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.368335 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" podStartSLOduration=4.963001375 podStartE2EDuration="15.368318886s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.750852921 +0000 UTC m=+820.654328499" lastFinishedPulling="2025-10-01 12:51:39.156170432 +0000 UTC m=+831.059646010" observedRunningTime="2025-10-01 12:51:41.36521036 +0000 UTC m=+833.268685958" watchObservedRunningTime="2025-10-01 12:51:41.368318886 +0000 UTC m=+833.271794464" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.439955 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" podStartSLOduration=3.994879075 podStartE2EDuration="15.439927362s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:27.698197459 +0000 UTC m=+819.601673037" lastFinishedPulling="2025-10-01 12:51:39.143245746 +0000 UTC m=+831.046721324" observedRunningTime="2025-10-01 12:51:41.434116612 +0000 UTC m=+833.337592190" watchObservedRunningTime="2025-10-01 12:51:41.439927362 +0000 UTC m=+833.343402940" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.441732 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" podStartSLOduration=4.398726535 podStartE2EDuration="15.441726522s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.100254659 +0000 UTC m=+820.003730237" lastFinishedPulling="2025-10-01 12:51:39.143254636 +0000 UTC m=+831.046730224" observedRunningTime="2025-10-01 12:51:41.399134447 +0000 UTC m=+833.302610045" watchObservedRunningTime="2025-10-01 12:51:41.441726522 +0000 UTC m=+833.345202100" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.484277 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml" podStartSLOduration=5.084507659 podStartE2EDuration="15.484248976s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.754011518 +0000 UTC m=+820.657487096" lastFinishedPulling="2025-10-01 12:51:39.153752815 +0000 UTC m=+831.057228413" observedRunningTime="2025-10-01 12:51:41.480653817 +0000 UTC m=+833.384129415" watchObservedRunningTime="2025-10-01 12:51:41.484248976 +0000 UTC m=+833.387724554" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.512056 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2" podStartSLOduration=4.949596514 podStartE2EDuration="15.512041664s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.600014856 +0000 UTC m=+820.503490434" lastFinishedPulling="2025-10-01 12:51:39.162460016 +0000 UTC m=+831.065935584" observedRunningTime="2025-10-01 12:51:41.50684425 +0000 UTC m=+833.410319838" watchObservedRunningTime="2025-10-01 12:51:41.512041664 +0000 UTC m=+833.415517242" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.534332 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" podStartSLOduration=4.4851022 podStartE2EDuration="15.534303468s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.109940376 +0000 UTC m=+820.013415954" lastFinishedPulling="2025-10-01 12:51:39.159141644 +0000 UTC m=+831.062617222" observedRunningTime="2025-10-01 12:51:41.52892941 +0000 UTC m=+833.432404978" watchObservedRunningTime="2025-10-01 12:51:41.534303468 +0000 UTC m=+833.437779046" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.550952 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" podStartSLOduration=4.503085968 podStartE2EDuration="15.550937688s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.092346551 +0000 UTC m=+819.995822129" lastFinishedPulling="2025-10-01 12:51:39.140198241 +0000 UTC m=+831.043673849" observedRunningTime="2025-10-01 12:51:41.549443387 +0000 UTC m=+833.452918965" watchObservedRunningTime="2025-10-01 12:51:41.550937688 +0000 UTC m=+833.454413266" Oct 01 12:51:41 crc kubenswrapper[4913]: I1001 12:51:41.567962 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" podStartSLOduration=4.33154082 podStartE2EDuration="15.567948127s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:27.903844626 +0000 UTC m=+819.807320204" lastFinishedPulling="2025-10-01 12:51:39.140251893 +0000 UTC m=+831.043727511" observedRunningTime="2025-10-01 12:51:41.564226435 +0000 UTC m=+833.467702023" watchObservedRunningTime="2025-10-01 12:51:41.567948127 +0000 UTC m=+833.471423705" Oct 01 12:51:42 crc kubenswrapper[4913]: I1001 12:51:42.235282 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerStarted","Data":"fe88112b195abf79bdfeea8d4877ee930e510d7cbfd1a8274174dbe133b7f4c2"} Oct 01 12:51:43 crc kubenswrapper[4913]: I1001 12:51:43.245089 4913 generic.go:334] "Generic (PLEG): container finished" podID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerID="fe88112b195abf79bdfeea8d4877ee930e510d7cbfd1a8274174dbe133b7f4c2" exitCode=0 Oct 01 12:51:43 crc kubenswrapper[4913]: I1001 12:51:43.245206 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerDied","Data":"fe88112b195abf79bdfeea8d4877ee930e510d7cbfd1a8274174dbe133b7f4c2"} Oct 01 12:51:44 crc kubenswrapper[4913]: I1001 12:51:44.266046 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerStarted","Data":"b60e103990117ee98da23e63d2bb7d657ed589e5de702dc09c841d751417b124"} Oct 01 12:51:44 crc kubenswrapper[4913]: I1001 12:51:44.284905 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l945r" podStartSLOduration=6.743968557 podStartE2EDuration="9.284888857s" podCreationTimestamp="2025-10-01 12:51:35 +0000 UTC" firstStartedPulling="2025-10-01 12:51:41.170761751 +0000 UTC m=+833.074237329" lastFinishedPulling="2025-10-01 12:51:43.711682061 +0000 UTC m=+835.615157629" observedRunningTime="2025-10-01 
12:51:44.28136731 +0000 UTC m=+836.184842898" watchObservedRunningTime="2025-10-01 12:51:44.284888857 +0000 UTC m=+836.188364445" Oct 01 12:51:45 crc kubenswrapper[4913]: I1001 12:51:45.491916 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:45 crc kubenswrapper[4913]: I1001 12:51:45.492207 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.289632 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" event={"ID":"3f17e8a8-3251-4a03-8f8d-70698d5146b3","Type":"ContainerStarted","Data":"38a53254edd10c9227ab6fa8d959c04c8ba505e4d7baaf78fe5949f00bed5b7f"} Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.289868 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.291802 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" event={"ID":"e80ad896-dd86-43e6-850b-f12088a61cf5","Type":"ContainerStarted","Data":"b2da28546aa2049a2689a478b6d261fce29b5f1d7d992008377c753f08f678f9"} Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.292453 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.297323 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" event={"ID":"a19e1891-5907-42dc-9894-6c9b3bcb5cce","Type":"ContainerStarted","Data":"13bd230e81d558e42c61a11a94eb0a9577603cfe7400d2057b4d5d1365b7b843"} Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.297844 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.318531 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" podStartSLOduration=3.25069623 podStartE2EDuration="20.318512461s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.82942984 +0000 UTC m=+820.732905418" lastFinishedPulling="2025-10-01 12:51:45.897246061 +0000 UTC m=+837.800721649" observedRunningTime="2025-10-01 12:51:46.315408866 +0000 UTC m=+838.218884484" watchObservedRunningTime="2025-10-01 12:51:46.318512461 +0000 UTC m=+838.221988049" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.365379 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" podStartSLOduration=3.297267946 podStartE2EDuration="20.365363995s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.829229015 +0000 UTC m=+820.732704593" lastFinishedPulling="2025-10-01 12:51:45.897325054 +0000 UTC m=+837.800800642" observedRunningTime="2025-10-01 12:51:46.361678833 +0000 UTC m=+838.265154601" watchObservedRunningTime="2025-10-01 12:51:46.365363995 +0000 UTC m=+838.268839573" Oct 01 12:51:46 crc 
kubenswrapper[4913]: I1001 12:51:46.380810 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" podStartSLOduration=3.191601399 podStartE2EDuration="20.38079131s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.784462759 +0000 UTC m=+820.687938337" lastFinishedPulling="2025-10-01 12:51:45.97365267 +0000 UTC m=+837.877128248" observedRunningTime="2025-10-01 12:51:46.376973075 +0000 UTC m=+838.280448693" watchObservedRunningTime="2025-10-01 12:51:46.38079131 +0000 UTC m=+838.284266888" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.555949 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-l945r" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="registry-server" probeResult="failure" output=< Oct 01 12:51:46 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s Oct 01 12:51:46 crc kubenswrapper[4913]: > Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.815772 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-p6xjl" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.862475 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-wdw6b" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.877781 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-pbjqc" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.905681 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-gq8sp" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.939529 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-rxgj5" Oct 01 12:51:46 crc kubenswrapper[4913]: I1001 12:51:46.985478 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-mh5sx" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.051884 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-2ztjf" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.059986 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-hdrq8" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.092437 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-s7zlm" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.202096 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-5jvz7" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.239997 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nsm78" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.240648 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-pmhs8" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.273842 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-f6wpw" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.337725 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-pvvml" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.374553 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-2djx2" Oct 01 12:51:47 crc kubenswrapper[4913]: I1001 12:51:47.450156 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.338634 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" event={"ID":"b0d4dbf5-aa13-46ec-ab24-5c43a0be638c","Type":"ContainerStarted","Data":"bc03bd9a3b1558e90334f77dc7a3a6973008049b7f6a7b768e93d4592932a0a1"} Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.341518 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" event={"ID":"e0ef9126-ad52-4803-b11d-f8b1712c4efd","Type":"ContainerStarted","Data":"d418c51dbc6a832ab8e5e484b5b75a0f02c07296d642ccd93509ba2a43197365"} Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.342076 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.344361 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" event={"ID":"a34a6147-f982-48cf-9976-b39c4dd420cf","Type":"ContainerStarted","Data":"c5aa20852a45bf131f27489f2973b9f242b39a2b1cdf0281cebbab24ca231768"} Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.344744 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.359563 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5" podStartSLOduration=2.2581108739999998 podStartE2EDuration="22.359538246s" podCreationTimestamp="2025-10-01 12:51:27 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.802963329 +0000 UTC m=+820.706438907" lastFinishedPulling="2025-10-01 12:51:48.904390701 +0000 UTC m=+840.807866279" observedRunningTime="2025-10-01 12:51:49.352148343 +0000 UTC m=+841.255623931" watchObservedRunningTime="2025-10-01 12:51:49.359538246 +0000 UTC m=+841.263013844" Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.375441 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" podStartSLOduration=3.244073198 podStartE2EDuration="23.375417075s" podCreationTimestamp="2025-10-01 12:51:26 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.784423588 +0000 UTC m=+820.687899166" lastFinishedPulling="2025-10-01 12:51:48.915767465 +0000 UTC 
m=+840.819243043" observedRunningTime="2025-10-01 12:51:49.366196331 +0000 UTC m=+841.269671929" watchObservedRunningTime="2025-10-01 12:51:49.375417075 +0000 UTC m=+841.278892653" Oct 01 12:51:49 crc kubenswrapper[4913]: I1001 12:51:49.406522 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" podStartSLOduration=2.293406949 podStartE2EDuration="22.406503723s" podCreationTimestamp="2025-10-01 12:51:27 +0000 UTC" firstStartedPulling="2025-10-01 12:51:28.787549674 +0000 UTC m=+820.691025252" lastFinishedPulling="2025-10-01 12:51:48.900646458 +0000 UTC m=+840.804122026" observedRunningTime="2025-10-01 12:51:49.405621529 +0000 UTC m=+841.309097127" watchObservedRunningTime="2025-10-01 12:51:49.406503723 +0000 UTC m=+841.309979291" Oct 01 12:51:55 crc kubenswrapper[4913]: I1001 12:51:55.572337 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:55 crc kubenswrapper[4913]: I1001 12:51:55.641607 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:55 crc kubenswrapper[4913]: I1001 12:51:55.815570 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l945r"] Oct 01 12:51:57 crc kubenswrapper[4913]: I1001 12:51:57.217147 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-f8tw7" Oct 01 12:51:57 crc kubenswrapper[4913]: I1001 12:51:57.233341 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-45w5m" Oct 01 12:51:57 crc kubenswrapper[4913]: I1001 12:51:57.356172 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-8668d" Oct 01 12:51:57 crc kubenswrapper[4913]: I1001 12:51:57.412008 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l945r" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="registry-server" containerID="cri-o://b60e103990117ee98da23e63d2bb7d657ed589e5de702dc09c841d751417b124" gracePeriod=2 Oct 01 12:51:57 crc kubenswrapper[4913]: I1001 12:51:57.545370 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-4g45w" Oct 01 12:51:57 crc kubenswrapper[4913]: I1001 12:51:57.909412 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-659bb84579hvlsl" Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.420872 4913 generic.go:334] "Generic (PLEG): container finished" podID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerID="b60e103990117ee98da23e63d2bb7d657ed589e5de702dc09c841d751417b124" exitCode=0 Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.420952 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerDied","Data":"b60e103990117ee98da23e63d2bb7d657ed589e5de702dc09c841d751417b124"} Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.761578 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.942832 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vj6b\" (UniqueName: \"kubernetes.io/projected/5dddf17f-397f-42c9-a202-94d68fe00c52-kube-api-access-8vj6b\") pod \"5dddf17f-397f-42c9-a202-94d68fe00c52\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.943012 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-catalog-content\") pod \"5dddf17f-397f-42c9-a202-94d68fe00c52\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.943049 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-utilities\") pod \"5dddf17f-397f-42c9-a202-94d68fe00c52\" (UID: \"5dddf17f-397f-42c9-a202-94d68fe00c52\") " Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.944518 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-utilities" (OuterVolumeSpecName: "utilities") pod "5dddf17f-397f-42c9-a202-94d68fe00c52" (UID: "5dddf17f-397f-42c9-a202-94d68fe00c52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.948722 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dddf17f-397f-42c9-a202-94d68fe00c52-kube-api-access-8vj6b" (OuterVolumeSpecName: "kube-api-access-8vj6b") pod "5dddf17f-397f-42c9-a202-94d68fe00c52" (UID: "5dddf17f-397f-42c9-a202-94d68fe00c52"). InnerVolumeSpecName "kube-api-access-8vj6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:58 crc kubenswrapper[4913]: I1001 12:51:58.987716 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dddf17f-397f-42c9-a202-94d68fe00c52" (UID: "5dddf17f-397f-42c9-a202-94d68fe00c52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.044538 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.044568 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dddf17f-397f-42c9-a202-94d68fe00c52-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.044580 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vj6b\" (UniqueName: \"kubernetes.io/projected/5dddf17f-397f-42c9-a202-94d68fe00c52-kube-api-access-8vj6b\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.431886 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l945r" event={"ID":"5dddf17f-397f-42c9-a202-94d68fe00c52","Type":"ContainerDied","Data":"618aee65f361fb6093d4b89773dde77cf89fbd471d3f1cda731410fc4f3e4d87"} Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.432146 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l945r" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.432187 4913 scope.go:117] "RemoveContainer" containerID="b60e103990117ee98da23e63d2bb7d657ed589e5de702dc09c841d751417b124" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.465625 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l945r"] Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.468290 4913 scope.go:117] "RemoveContainer" containerID="fe88112b195abf79bdfeea8d4877ee930e510d7cbfd1a8274174dbe133b7f4c2" Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.472075 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l945r"] Oct 01 12:51:59 crc kubenswrapper[4913]: I1001 12:51:59.499932 4913 scope.go:117] "RemoveContainer" containerID="e95ad1039bb829453e5b1ead39da6b601c53aeeb79d1ba462092859666acd6e0" Oct 01 12:52:00 crc kubenswrapper[4913]: I1001 12:52:00.816395 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" path="/var/lib/kubelet/pods/5dddf17f-397f-42c9-a202-94d68fe00c52/volumes" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.262066 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-hlmd2"] Oct 01 12:52:13 crc kubenswrapper[4913]: E1001 12:52:13.262847 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="extract-utilities" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.262860 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="extract-utilities" Oct 01 12:52:13 crc kubenswrapper[4913]: E1001 12:52:13.262894 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="extract-content" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.262901 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="extract-content" Oct 01 12:52:13 crc kubenswrapper[4913]: E1001 12:52:13.262934 4913 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="registry-server" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.262940 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="registry-server" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.263083 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dddf17f-397f-42c9-a202-94d68fe00c52" containerName="registry-server" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.263791 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.265311 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7dqpd" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.278358 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.278686 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.278885 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.287207 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-hlmd2"] Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.349861 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/d9845757-12e6-42c3-a6e2-89d88cca7bb7-kube-api-access-876tw\") pod \"dnsmasq-dns-b8b69cf79-hlmd2\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.349917 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9845757-12e6-42c3-a6e2-89d88cca7bb7-config\") pod \"dnsmasq-dns-b8b69cf79-hlmd2\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.363549 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-j56vm"] Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.364922 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.366300 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.381468 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-j56vm"] Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.451639 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmxl2\" (UniqueName: \"kubernetes.io/projected/1e122b22-fdd9-485f-92d7-e105f64e8a2f-kube-api-access-xmxl2\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.451697 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-config\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.451782 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.451853 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/d9845757-12e6-42c3-a6e2-89d88cca7bb7-kube-api-access-876tw\") pod \"dnsmasq-dns-b8b69cf79-hlmd2\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.451891 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9845757-12e6-42c3-a6e2-89d88cca7bb7-config\") pod \"dnsmasq-dns-b8b69cf79-hlmd2\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.452894 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9845757-12e6-42c3-a6e2-89d88cca7bb7-config\") pod \"dnsmasq-dns-b8b69cf79-hlmd2\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.475389 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/d9845757-12e6-42c3-a6e2-89d88cca7bb7-kube-api-access-876tw\") pod \"dnsmasq-dns-b8b69cf79-hlmd2\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.553196 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.553339 
4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmxl2\" (UniqueName: \"kubernetes.io/projected/1e122b22-fdd9-485f-92d7-e105f64e8a2f-kube-api-access-xmxl2\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.553364 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-config\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.554013 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.554136 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-config\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.568615 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmxl2\" (UniqueName: \"kubernetes.io/projected/1e122b22-fdd9-485f-92d7-e105f64e8a2f-kube-api-access-xmxl2\") pod \"dnsmasq-dns-d5f6f49c7-j56vm\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.585809 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:13 crc kubenswrapper[4913]: I1001 12:52:13.700017 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:14 crc kubenswrapper[4913]: I1001 12:52:14.041533 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-hlmd2"] Oct 01 12:52:14 crc kubenswrapper[4913]: W1001 12:52:14.042734 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9845757_12e6_42c3_a6e2_89d88cca7bb7.slice/crio-672984cc48bae814a210b44682e9108d2781e10abd187b43ee081bd215197d5f WatchSource:0}: Error finding container 672984cc48bae814a210b44682e9108d2781e10abd187b43ee081bd215197d5f: Status 404 returned error can't find the container with id 672984cc48bae814a210b44682e9108d2781e10abd187b43ee081bd215197d5f Oct 01 12:52:14 crc kubenswrapper[4913]: I1001 12:52:14.045019 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:52:14 crc kubenswrapper[4913]: I1001 12:52:14.129134 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-j56vm"] Oct 01 12:52:14 crc kubenswrapper[4913]: W1001 12:52:14.131944 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e122b22_fdd9_485f_92d7_e105f64e8a2f.slice/crio-a27f3e95f04d0aa47cccd44898ca9add9489f024bc386012aa0c28263d71dd97 WatchSource:0}: Error finding container a27f3e95f04d0aa47cccd44898ca9add9489f024bc386012aa0c28263d71dd97: Status 404 returned error can't find the container with id a27f3e95f04d0aa47cccd44898ca9add9489f024bc386012aa0c28263d71dd97 Oct 01 12:52:14 crc kubenswrapper[4913]: I1001 12:52:14.558320 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" event={"ID":"1e122b22-fdd9-485f-92d7-e105f64e8a2f","Type":"ContainerStarted","Data":"a27f3e95f04d0aa47cccd44898ca9add9489f024bc386012aa0c28263d71dd97"} Oct 01 12:52:14 crc kubenswrapper[4913]: I1001 12:52:14.560428 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" event={"ID":"d9845757-12e6-42c3-a6e2-89d88cca7bb7","Type":"ContainerStarted","Data":"672984cc48bae814a210b44682e9108d2781e10abd187b43ee081bd215197d5f"} Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.278490 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-hlmd2"] Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.306437 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-629ss"] Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.308013 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.321395 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-629ss"] Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.398306 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.398375 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqplc\" (UniqueName: \"kubernetes.io/projected/c80e2a0f-0343-481e-8939-a9c3c861c50c-kube-api-access-rqplc\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.398437 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-config\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.500073 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.500127 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqplc\" (UniqueName: \"kubernetes.io/projected/c80e2a0f-0343-481e-8939-a9c3c861c50c-kube-api-access-rqplc\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.500190 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-config\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.501093 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-config\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.501228 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.533166 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqplc\" (UniqueName: 
\"kubernetes.io/projected/c80e2a0f-0343-481e-8939-a9c3c861c50c-kube-api-access-rqplc\") pod \"dnsmasq-dns-b6f94bdfc-629ss\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.621441 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-j56vm"] Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.638527 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.641559 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-rksl6"] Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.642791 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.654342 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-rksl6"] Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.803099 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-config\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.803398 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-dns-svc\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.803454 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4kbl\" (UniqueName: \"kubernetes.io/projected/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-kube-api-access-h4kbl\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.907586 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-config\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.907655 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-dns-svc\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.907729 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4kbl\" (UniqueName: \"kubernetes.io/projected/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-kube-api-access-h4kbl\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.909940 4913 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-dns-svc\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.910660 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-config\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.948352 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4kbl\" (UniqueName: \"kubernetes.io/projected/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-kube-api-access-h4kbl\") pod \"dnsmasq-dns-77795d58f5-rksl6\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") " pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:16 crc kubenswrapper[4913]: I1001 12:52:16.979855 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.182107 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-629ss"] Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.461484 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-rksl6"] Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.489167 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.490412 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.492622 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.492809 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.493246 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.493644 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.493759 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.496950 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.499117 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jlsgv" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.505932 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.627987 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628027 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628093 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-config-data\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628136 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628170 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a7d9a6-e81a-483a-8408-6784ded67834-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628217 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfhhk\" (UniqueName: 
\"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-kube-api-access-zfhhk\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628252 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628341 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628373 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628430 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a7d9a6-e81a-483a-8408-6784ded67834-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.628456 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729607 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729656 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-config-data\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729678 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729710 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " 
pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729754 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a7d9a6-e81a-483a-8408-6784ded67834-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729784 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfhhk\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-kube-api-access-zfhhk\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729818 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729841 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729863 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729888 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a7d9a6-e81a-483a-8408-6784ded67834-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.729907 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.730695 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-config-data\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.730726 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.730850 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.731173 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.732009 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.732787 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-server-conf\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.736253 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.739590 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a7d9a6-e81a-483a-8408-6784ded67834-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.745591 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a7d9a6-e81a-483a-8408-6784ded67834-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.747184 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfhhk\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-kube-api-access-zfhhk\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.751692 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.756958 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.773576 4913 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.775702 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.779069 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.779488 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.779606 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.779744 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.779841 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-65sv7" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.779931 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.780015 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.792021 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.825139 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932404 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932444 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932473 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932494 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vctsw\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-kube-api-access-vctsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932524 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932747 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932814 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932841 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932868 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932883 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:17 crc kubenswrapper[4913]: I1001 12:52:17.932915 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034166 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vctsw\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-kube-api-access-vctsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034224 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034283 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034309 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034325 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034341 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034364 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034387 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034418 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034436 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034458 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.034856 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") device mount path 
\"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.035116 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.035784 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.035801 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.035919 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.036239 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.039853 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.041083 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.044941 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.050522 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.055030 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vctsw\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-kube-api-access-vctsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.058712 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:18 crc kubenswrapper[4913]: I1001 12:52:18.118939 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:19 crc kubenswrapper[4913]: I1001 12:52:19.608734 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" event={"ID":"c80e2a0f-0343-481e-8939-a9c3c861c50c","Type":"ContainerStarted","Data":"0b3457e93016f76750d1818bdf16cdf9ac9ab6f3dafbe0f1de430f70ce0cd7fc"} Oct 01 12:52:20 crc kubenswrapper[4913]: W1001 12:52:20.017952 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e2fa990_1cf6_4826_9bbd_0e02bf405bcb.slice/crio-3eeccee84476b090b7664ed40be9bfc831cd51bf6e621295ba4f48a531023e06 WatchSource:0}: Error finding container 3eeccee84476b090b7664ed40be9bfc831cd51bf6e621295ba4f48a531023e06: Status 404 returned error can't find the container with id 3eeccee84476b090b7664ed40be9bfc831cd51bf6e621295ba4f48a531023e06 Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.465211 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.467850 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.471155 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.471291 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.471385 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9znwx" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.471716 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.471791 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.483628 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.485328 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.504666 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.507886 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.509526 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ftvvl" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.511434 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.511853 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.515214 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.526038 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572507 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-secrets\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572560 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-config-data-default\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572589 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572606 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-kolla-config\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572629 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572647 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhm2m\" (UniqueName: \"kubernetes.io/projected/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-kube-api-access-mhm2m\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572665 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-combined-ca-bundle\") pod 
\"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572686 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.572709 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.620218 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" event={"ID":"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb","Type":"ContainerStarted","Data":"3eeccee84476b090b7664ed40be9bfc831cd51bf6e621295ba4f48a531023e06"} Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673472 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673515 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673541 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-secrets\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673616 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-config-data-default\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673673 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673706 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" 
Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673755 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673789 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-kolla-config\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673835 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673871 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhm2m\" (UniqueName: \"kubernetes.io/projected/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-kube-api-access-mhm2m\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673906 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673946 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.673981 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.674025 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.674064 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcgv\" (UniqueName: \"kubernetes.io/projected/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-kube-api-access-kwcgv\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.674099 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.674134 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.674166 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.674502 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.675056 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-kolla-config\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.675628 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-config-data-default\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.676030 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.679549 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.685073 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-secrets\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.689515 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " 
pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.700371 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhm2m\" (UniqueName: \"kubernetes.io/projected/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-kube-api-access-mhm2m\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.701584 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9cea593-9b1f-44f0-8f3a-831b4b0ee98d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.724986 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d\") " pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.776829 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.776897 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcgv\" (UniqueName: \"kubernetes.io/projected/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-kube-api-access-kwcgv\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.776923 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.776947 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.776970 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.777017 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.777041 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.777084 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.777105 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.777572 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.777727 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.778617 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.779384 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.786693 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.790308 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.790998 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.794215 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.795641 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.822162 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcgv\" (UniqueName: \"kubernetes.io/projected/fe74cb0a-552b-42dd-a1af-acb58e98b7dd-kube-api-access-kwcgv\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.825769 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fe74cb0a-552b-42dd-a1af-acb58e98b7dd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.884223 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.885341 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.887543 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.887674 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.887746 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bhtfc" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.889339 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.979479 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lj47\" (UniqueName: \"kubernetes.io/projected/0cfc30d4-ee8f-4491-a558-cb067b0dec39-kube-api-access-2lj47\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.979587 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0cfc30d4-ee8f-4491-a558-cb067b0dec39-kolla-config\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.979643 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfc30d4-ee8f-4491-a558-cb067b0dec39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.979669 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cfc30d4-ee8f-4491-a558-cb067b0dec39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:20 crc kubenswrapper[4913]: I1001 12:52:20.979728 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0cfc30d4-ee8f-4491-a558-cb067b0dec39-config-data\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.080536 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfc30d4-ee8f-4491-a558-cb067b0dec39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.080773 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cfc30d4-ee8f-4491-a558-cb067b0dec39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.080906 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0cfc30d4-ee8f-4491-a558-cb067b0dec39-config-data\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.080984 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lj47\" (UniqueName: \"kubernetes.io/projected/0cfc30d4-ee8f-4491-a558-cb067b0dec39-kube-api-access-2lj47\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.081106 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0cfc30d4-ee8f-4491-a558-cb067b0dec39-kolla-config\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.081820 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0cfc30d4-ee8f-4491-a558-cb067b0dec39-kolla-config\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.082318 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0cfc30d4-ee8f-4491-a558-cb067b0dec39-config-data\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.085228 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfc30d4-ee8f-4491-a558-cb067b0dec39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 
12:52:21.085405 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cfc30d4-ee8f-4491-a558-cb067b0dec39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.102619 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lj47\" (UniqueName: \"kubernetes.io/projected/0cfc30d4-ee8f-4491-a558-cb067b0dec39-kube-api-access-2lj47\") pod \"memcached-0\" (UID: \"0cfc30d4-ee8f-4491-a558-cb067b0dec39\") " pod="openstack/memcached-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.125971 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:21 crc kubenswrapper[4913]: I1001 12:52:21.214301 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 12:52:22 crc kubenswrapper[4913]: I1001 12:52:22.762932 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:52:22 crc kubenswrapper[4913]: I1001 12:52:22.765432 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:52:22 crc kubenswrapper[4913]: I1001 12:52:22.774173 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:52:22 crc kubenswrapper[4913]: I1001 12:52:22.776333 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-b4s69" Oct 01 12:52:22 crc kubenswrapper[4913]: I1001 12:52:22.908059 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjvz4\" (UniqueName: \"kubernetes.io/projected/96fdbf7e-8781-4743-a69c-e56b650fb429-kube-api-access-vjvz4\") pod \"kube-state-metrics-0\" (UID: \"96fdbf7e-8781-4743-a69c-e56b650fb429\") " pod="openstack/kube-state-metrics-0" Oct 01 12:52:23 crc kubenswrapper[4913]: I1001 12:52:23.009633 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjvz4\" (UniqueName: \"kubernetes.io/projected/96fdbf7e-8781-4743-a69c-e56b650fb429-kube-api-access-vjvz4\") pod \"kube-state-metrics-0\" (UID: \"96fdbf7e-8781-4743-a69c-e56b650fb429\") " pod="openstack/kube-state-metrics-0" Oct 01 12:52:23 crc kubenswrapper[4913]: I1001 12:52:23.039139 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjvz4\" (UniqueName: \"kubernetes.io/projected/96fdbf7e-8781-4743-a69c-e56b650fb429-kube-api-access-vjvz4\") pod \"kube-state-metrics-0\" (UID: \"96fdbf7e-8781-4743-a69c-e56b650fb429\") " pod="openstack/kube-state-metrics-0" Oct 01 12:52:23 crc kubenswrapper[4913]: I1001 12:52:23.095785 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.960970 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vjkgr"] Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.974601 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.975681 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vjkgr"] Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.976463 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.977180 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.977987 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-gh4hp" Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.982952 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7ck28"] Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.985346 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:26 crc kubenswrapper[4913]: I1001 12:52:26.994653 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7ck28"] Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080160 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-run\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080204 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtc7j\" (UniqueName: \"kubernetes.io/projected/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-kube-api-access-xtc7j\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080249 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-run-ovn\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080289 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-ovn-controller-tls-certs\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080442 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-log\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080634 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-combined-ca-bundle\") pod \"ovn-controller-vjkgr\" (UID: 
\"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080691 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-etc-ovs\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080742 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-scripts\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080815 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-log-ovn\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080867 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-run\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080911 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-lib\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.080938 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx47t\" (UniqueName: \"kubernetes.io/projected/da2fc463-557f-4a82-bd58-60b0e08930a4-kube-api-access-xx47t\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.081055 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2fc463-557f-4a82-bd58-60b0e08930a4-scripts\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.182734 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2fc463-557f-4a82-bd58-60b0e08930a4-scripts\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.182795 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-run\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc 
kubenswrapper[4913]: I1001 12:52:27.182822 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtc7j\" (UniqueName: \"kubernetes.io/projected/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-kube-api-access-xtc7j\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.182879 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-run-ovn\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.182908 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-ovn-controller-tls-certs\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.182952 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-log\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.182991 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-combined-ca-bundle\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183017 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-etc-ovs\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183044 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-scripts\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183096 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-log-ovn\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183122 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-run\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183150 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-lib\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183171 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx47t\" (UniqueName: \"kubernetes.io/projected/da2fc463-557f-4a82-bd58-60b0e08930a4-kube-api-access-xx47t\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183537 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-run\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.183971 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-etc-ovs\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.184047 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-run\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.184067 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-log-ovn\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.184206 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-lib\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.184541 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/da2fc463-557f-4a82-bd58-60b0e08930a4-var-log\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.184650 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-var-run-ovn\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.184982 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2fc463-557f-4a82-bd58-60b0e08930a4-scripts\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.186281 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-scripts\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.189954 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-ovn-controller-tls-certs\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.196095 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-combined-ca-bundle\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.201449 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx47t\" (UniqueName: \"kubernetes.io/projected/da2fc463-557f-4a82-bd58-60b0e08930a4-kube-api-access-xx47t\") pod \"ovn-controller-ovs-7ck28\" (UID: \"da2fc463-557f-4a82-bd58-60b0e08930a4\") " pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.216018 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtc7j\" (UniqueName: \"kubernetes.io/projected/e90d4e3a-3c02-4d5d-84f4-32d5cb411f77-kube-api-access-xtc7j\") pod \"ovn-controller-vjkgr\" (UID: \"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77\") " pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.216457 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.218043 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.223617 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.223690 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.223828 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-r5gvw" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.223951 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.224130 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.255479 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.284599 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.284640 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.284661 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.284686 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkr9l\" (UniqueName: \"kubernetes.io/projected/71b728de-af3e-4c33-baad-98a46852c91f-kube-api-access-gkr9l\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.284813 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71b728de-af3e-4c33-baad-98a46852c91f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.284884 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b728de-af3e-4c33-baad-98a46852c91f-config\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.285012 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71b728de-af3e-4c33-baad-98a46852c91f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.285036 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.301285 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.310404 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.386756 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387030 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387056 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkr9l\" (UniqueName: \"kubernetes.io/projected/71b728de-af3e-4c33-baad-98a46852c91f-kube-api-access-gkr9l\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387108 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71b728de-af3e-4c33-baad-98a46852c91f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387134 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b728de-af3e-4c33-baad-98a46852c91f-config\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387199 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387213 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71b728de-af3e-4c33-baad-98a46852c91f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.387260 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.388247 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71b728de-af3e-4c33-baad-98a46852c91f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.388804 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.389011 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b728de-af3e-4c33-baad-98a46852c91f-config\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.389485 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71b728de-af3e-4c33-baad-98a46852c91f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.391472 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.394951 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.395692 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b728de-af3e-4c33-baad-98a46852c91f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.405966 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkr9l\" (UniqueName: \"kubernetes.io/projected/71b728de-af3e-4c33-baad-98a46852c91f-kube-api-access-gkr9l\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc kubenswrapper[4913]: I1001 12:52:27.411602 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"71b728de-af3e-4c33-baad-98a46852c91f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:27 crc 
kubenswrapper[4913]: I1001 12:52:27.579123 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:28 crc kubenswrapper[4913]: E1001 12:52:28.761886 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Oct 01 12:52:28 crc kubenswrapper[4913]: E1001 12:52:28.762250 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmxl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-d5f6f49c7-j56vm_openstack(1e122b22-fdd9-485f-92d7-e105f64e8a2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:52:28 crc kubenswrapper[4913]: E1001 12:52:28.763523 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" podUID="1e122b22-fdd9-485f-92d7-e105f64e8a2f" Oct 01 12:52:28 crc kubenswrapper[4913]: E1001 12:52:28.786903 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Oct 01 12:52:28 crc kubenswrapper[4913]: E1001 12:52:28.787084 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-876tw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b8b69cf79-hlmd2_openstack(d9845757-12e6-42c3-a6e2-89d88cca7bb7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:52:28 crc kubenswrapper[4913]: E1001 12:52:28.789395 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" podUID="d9845757-12e6-42c3-a6e2-89d88cca7bb7" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.093484 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: W1001 12:52:29.098539 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19a7d9a6_e81a_483a_8408_6784ded67834.slice/crio-4a117030bf9941b790d08e2ca7167093d342b11941e43bc4cce0ddd8fe4beb5d WatchSource:0}: Error finding container 4a117030bf9941b790d08e2ca7167093d342b11941e43bc4cce0ddd8fe4beb5d: Status 404 returned error can't find the container with id 4a117030bf9941b790d08e2ca7167093d342b11941e43bc4cce0ddd8fe4beb5d Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 
12:52:29.235050 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.410077 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vjkgr"] Oct 01 12:52:29 crc kubenswrapper[4913]: W1001 12:52:29.412968 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96fdbf7e_8781_4743_a69c_e56b650fb429.slice/crio-0bc35e333c156d82100adfbe7c30947cf62517717bb45bed4973f9c25432a4e5 WatchSource:0}: Error finding container 0bc35e333c156d82100adfbe7c30947cf62517717bb45bed4973f9c25432a4e5: Status 404 returned error can't find the container with id 0bc35e333c156d82100adfbe7c30947cf62517717bb45bed4973f9c25432a4e5 Oct 01 12:52:29 crc kubenswrapper[4913]: W1001 12:52:29.417875 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode90d4e3a_3c02_4d5d_84f4_32d5cb411f77.slice/crio-b65480e15c3295f8df7e9b21d3380064f2366a8556ca6df09fff0793cac247a9 WatchSource:0}: Error finding container b65480e15c3295f8df7e9b21d3380064f2366a8556ca6df09fff0793cac247a9: Status 404 returned error can't find the container with id b65480e15c3295f8df7e9b21d3380064f2366a8556ca6df09fff0793cac247a9 Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.426420 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.436873 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.444759 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: W1001 12:52:29.446186 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe74cb0a_552b_42dd_a1af_acb58e98b7dd.slice/crio-1d2de31499311bed829ff80b4f6fbc2dba8ea6e3a164fadd922a0cf07353592c WatchSource:0}: Error finding container 1d2de31499311bed829ff80b4f6fbc2dba8ea6e3a164fadd922a0cf07353592c: Status 404 returned error can't find the container with id 1d2de31499311bed829ff80b4f6fbc2dba8ea6e3a164fadd922a0cf07353592c Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.450569 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.452361 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.456557 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.456800 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.456939 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.457442 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jtsj6" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.459731 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.519803 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.519954 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f930acb-c2e8-4c5f-8d1b-acdc15375467-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.520150 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f930acb-c2e8-4c5f-8d1b-acdc15375467-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.520235 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.520439 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.520488 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9fd\" (UniqueName: \"kubernetes.io/projected/5f930acb-c2e8-4c5f-8d1b-acdc15375467-kube-api-access-7m9fd\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.520618 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f930acb-c2e8-4c5f-8d1b-acdc15375467-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.520641 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.549544 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: W1001 12:52:29.552582 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b728de_af3e_4c33_baad_98a46852c91f.slice/crio-79785a4621f3517bbe9f17010e673b342109a804f2f3545a6d2d6d042cc50d01 WatchSource:0}: Error finding container 79785a4621f3517bbe9f17010e673b342109a804f2f3545a6d2d6d042cc50d01: Status 404 returned error can't find the container with id 79785a4621f3517bbe9f17010e673b342109a804f2f3545a6d2d6d042cc50d01 Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622424 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f930acb-c2e8-4c5f-8d1b-acdc15375467-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622460 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622508 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622551 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f930acb-c2e8-4c5f-8d1b-acdc15375467-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622576 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f930acb-c2e8-4c5f-8d1b-acdc15375467-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622594 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622630 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.622649 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9fd\" (UniqueName: \"kubernetes.io/projected/5f930acb-c2e8-4c5f-8d1b-acdc15375467-kube-api-access-7m9fd\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.624411 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f930acb-c2e8-4c5f-8d1b-acdc15375467-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.624566 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f930acb-c2e8-4c5f-8d1b-acdc15375467-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.624692 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f930acb-c2e8-4c5f-8d1b-acdc15375467-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.624936 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.628197 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.628555 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.631727 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f930acb-c2e8-4c5f-8d1b-acdc15375467-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.647608 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.648317 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9fd\" (UniqueName: 
\"kubernetes.io/projected/5f930acb-c2e8-4c5f-8d1b-acdc15375467-kube-api-access-7m9fd\") pod \"ovsdbserver-sb-0\" (UID: \"5f930acb-c2e8-4c5f-8d1b-acdc15375467\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.651460 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.690588 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vjkgr" event={"ID":"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77","Type":"ContainerStarted","Data":"b65480e15c3295f8df7e9b21d3380064f2366a8556ca6df09fff0793cac247a9"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.691508 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"71b728de-af3e-4c33-baad-98a46852c91f","Type":"ContainerStarted","Data":"79785a4621f3517bbe9f17010e673b342109a804f2f3545a6d2d6d042cc50d01"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.694341 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a7d9a6-e81a-483a-8408-6784ded67834","Type":"ContainerStarted","Data":"4a117030bf9941b790d08e2ca7167093d342b11941e43bc4cce0ddd8fe4beb5d"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.696134 4913 generic.go:334] "Generic (PLEG): container finished" podID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerID="7ad946ff417de7525a068efb1840e4e101e5106c6fd05cda75c11ec99a2ea8b3" exitCode=0 Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.696216 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" event={"ID":"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb","Type":"ContainerDied","Data":"7ad946ff417de7525a068efb1840e4e101e5106c6fd05cda75c11ec99a2ea8b3"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.699373 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fe74cb0a-552b-42dd-a1af-acb58e98b7dd","Type":"ContainerStarted","Data":"1d2de31499311bed829ff80b4f6fbc2dba8ea6e3a164fadd922a0cf07353592c"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.702256 4913 generic.go:334] "Generic (PLEG): container finished" podID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerID="567c14d62421db8f79776009c494e9fb2a4a1482ba08fa8da8eb6d669c05b1e8" exitCode=0 Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.702316 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" event={"ID":"c80e2a0f-0343-481e-8939-a9c3c861c50c","Type":"ContainerDied","Data":"567c14d62421db8f79776009c494e9fb2a4a1482ba08fa8da8eb6d669c05b1e8"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.703562 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d","Type":"ContainerStarted","Data":"3a5d71a202910aec7a742c55a38b2b8f52eb5aa0766dd46caaf7f3390bbc0b52"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.704625 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0cfc30d4-ee8f-4491-a558-cb067b0dec39","Type":"ContainerStarted","Data":"0aded8dac417eba74e14badda69966b18dfad82ebd88f72ac379260f7a254376"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.706014 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"96fdbf7e-8781-4743-a69c-e56b650fb429","Type":"ContainerStarted","Data":"0bc35e333c156d82100adfbe7c30947cf62517717bb45bed4973f9c25432a4e5"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.710335 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eafb0f7f-ea12-4d4f-9097-5923b9345bc0","Type":"ContainerStarted","Data":"37514c9b2f9fbd5fa6b35cb513748fb615f5ec8e3d02462e9f8f1425b1c97f4e"} Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.770160 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7ck28"] Oct 01 12:52:29 crc kubenswrapper[4913]: I1001 12:52:29.789103 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:30 crc kubenswrapper[4913]: E1001 12:52:30.076440 4913 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 01 12:52:30 crc kubenswrapper[4913]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c80e2a0f-0343-481e-8939-a9c3c861c50c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 12:52:30 crc kubenswrapper[4913]: > podSandboxID="0b3457e93016f76750d1818bdf16cdf9ac9ab6f3dafbe0f1de430f70ce0cd7fc" Oct 01 12:52:30 crc kubenswrapper[4913]: E1001 12:52:30.077104 4913 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 12:52:30 crc kubenswrapper[4913]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqplc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b6f94bdfc-629ss_openstack(c80e2a0f-0343-481e-8939-a9c3c861c50c): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c80e2a0f-0343-481e-8939-a9c3c861c50c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 12:52:30 crc kubenswrapper[4913]: > logger="UnhandledError" Oct 01 12:52:30 crc kubenswrapper[4913]: E1001 12:52:30.078504 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c80e2a0f-0343-481e-8939-a9c3c861c50c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.136216 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.140636 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.235890 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-dns-svc\") pod \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.235960 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-config\") pod \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.236001 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/d9845757-12e6-42c3-a6e2-89d88cca7bb7-kube-api-access-876tw\") pod \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.236047 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9845757-12e6-42c3-a6e2-89d88cca7bb7-config\") pod \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\" (UID: \"d9845757-12e6-42c3-a6e2-89d88cca7bb7\") " Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.236072 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmxl2\" (UniqueName: \"kubernetes.io/projected/1e122b22-fdd9-485f-92d7-e105f64e8a2f-kube-api-access-xmxl2\") pod \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\" (UID: \"1e122b22-fdd9-485f-92d7-e105f64e8a2f\") " Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.237045 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e122b22-fdd9-485f-92d7-e105f64e8a2f" (UID: "1e122b22-fdd9-485f-92d7-e105f64e8a2f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.237145 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-config" (OuterVolumeSpecName: "config") pod "1e122b22-fdd9-485f-92d7-e105f64e8a2f" (UID: "1e122b22-fdd9-485f-92d7-e105f64e8a2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.237587 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9845757-12e6-42c3-a6e2-89d88cca7bb7-config" (OuterVolumeSpecName: "config") pod "d9845757-12e6-42c3-a6e2-89d88cca7bb7" (UID: "d9845757-12e6-42c3-a6e2-89d88cca7bb7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.237822 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9845757-12e6-42c3-a6e2-89d88cca7bb7-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.237844 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.237857 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e122b22-fdd9-485f-92d7-e105f64e8a2f-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.243350 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e122b22-fdd9-485f-92d7-e105f64e8a2f-kube-api-access-xmxl2" (OuterVolumeSpecName: "kube-api-access-xmxl2") pod "1e122b22-fdd9-485f-92d7-e105f64e8a2f" (UID: "1e122b22-fdd9-485f-92d7-e105f64e8a2f"). InnerVolumeSpecName "kube-api-access-xmxl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.243518 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9845757-12e6-42c3-a6e2-89d88cca7bb7-kube-api-access-876tw" (OuterVolumeSpecName: "kube-api-access-876tw") pod "d9845757-12e6-42c3-a6e2-89d88cca7bb7" (UID: "d9845757-12e6-42c3-a6e2-89d88cca7bb7"). InnerVolumeSpecName "kube-api-access-876tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.339792 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/d9845757-12e6-42c3-a6e2-89d88cca7bb7-kube-api-access-876tw\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.339831 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmxl2\" (UniqueName: \"kubernetes.io/projected/1e122b22-fdd9-485f-92d7-e105f64e8a2f-kube-api-access-xmxl2\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.401321 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 12:52:30 crc kubenswrapper[4913]: W1001 12:52:30.405708 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f930acb_c2e8_4c5f_8d1b_acdc15375467.slice/crio-d4d31da5828524f97dc3beeabb40677eb41bb9b37d79c2c31c77195056c5e6f2 WatchSource:0}: Error finding container d4d31da5828524f97dc3beeabb40677eb41bb9b37d79c2c31c77195056c5e6f2: Status 404 returned error can't find the container with id d4d31da5828524f97dc3beeabb40677eb41bb9b37d79c2c31c77195056c5e6f2 Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.719227 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" event={"ID":"d9845757-12e6-42c3-a6e2-89d88cca7bb7","Type":"ContainerDied","Data":"672984cc48bae814a210b44682e9108d2781e10abd187b43ee081bd215197d5f"} Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.719551 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-hlmd2" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.724879 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7ck28" event={"ID":"da2fc463-557f-4a82-bd58-60b0e08930a4","Type":"ContainerStarted","Data":"54c30d3fbfa2e7d3b55a11cdc7ef7cc1d06d8c32ff281b76698cf6d66a73534d"} Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.730525 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" event={"ID":"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb","Type":"ContainerStarted","Data":"bacfa30e75211038dad6d41161894a529ac701b363016d65c9fc4dc35ffa76c7"} Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.730723 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.731656 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f930acb-c2e8-4c5f-8d1b-acdc15375467","Type":"ContainerStarted","Data":"d4d31da5828524f97dc3beeabb40677eb41bb9b37d79c2c31c77195056c5e6f2"} Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.736130 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" event={"ID":"1e122b22-fdd9-485f-92d7-e105f64e8a2f","Type":"ContainerDied","Data":"a27f3e95f04d0aa47cccd44898ca9add9489f024bc386012aa0c28263d71dd97"} Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.736149 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.752249 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" podStartSLOduration=5.872880489 podStartE2EDuration="14.752231699s" podCreationTimestamp="2025-10-01 12:52:16 +0000 UTC" firstStartedPulling="2025-10-01 12:52:20.023967464 +0000 UTC m=+871.927443042" lastFinishedPulling="2025-10-01 12:52:28.903318674 +0000 UTC m=+880.806794252" observedRunningTime="2025-10-01 12:52:30.746573953 +0000 UTC m=+882.650049561" watchObservedRunningTime="2025-10-01 12:52:30.752231699 +0000 UTC m=+882.655707277" Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.819448 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-hlmd2"] Oct 01 12:52:30 crc kubenswrapper[4913]: I1001 12:52:30.825534 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-hlmd2"] Oct 01 12:52:31 crc kubenswrapper[4913]: I1001 12:52:31.745261 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" event={"ID":"c80e2a0f-0343-481e-8939-a9c3c861c50c","Type":"ContainerStarted","Data":"850aceab03ec0644413f06e5e180e9986dccf2cd35d8b891f6555bdfd18c63cc"} Oct 01 12:52:31 crc kubenswrapper[4913]: I1001 12:52:31.746085 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:31 crc kubenswrapper[4913]: I1001 12:52:31.767666 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" podStartSLOduration=6.288113002 podStartE2EDuration="15.767648194s" podCreationTimestamp="2025-10-01 12:52:16 +0000 UTC" firstStartedPulling="2025-10-01 12:52:19.419279799 +0000 UTC m=+871.322755377" lastFinishedPulling="2025-10-01 12:52:28.898814981 +0000 UTC 
m=+880.802290569" observedRunningTime="2025-10-01 12:52:31.760901187 +0000 UTC m=+883.664376775" watchObservedRunningTime="2025-10-01 12:52:31.767648194 +0000 UTC m=+883.671123762" Oct 01 12:52:32 crc kubenswrapper[4913]: I1001 12:52:32.816713 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9845757-12e6-42c3-a6e2-89d88cca7bb7" path="/var/lib/kubelet/pods/d9845757-12e6-42c3-a6e2-89d88cca7bb7/volumes" Oct 01 12:52:36 crc kubenswrapper[4913]: I1001 12:52:36.640408 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:36 crc kubenswrapper[4913]: I1001 12:52:36.981416 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" Oct 01 12:52:37 crc kubenswrapper[4913]: I1001 12:52:37.052881 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-629ss"] Oct 01 12:52:37 crc kubenswrapper[4913]: I1001 12:52:37.053235 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerName="dnsmasq-dns" containerID="cri-o://850aceab03ec0644413f06e5e180e9986dccf2cd35d8b891f6555bdfd18c63cc" gracePeriod=10 Oct 01 12:52:37 crc kubenswrapper[4913]: I1001 12:52:37.797134 4913 generic.go:334] "Generic (PLEG): container finished" podID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerID="850aceab03ec0644413f06e5e180e9986dccf2cd35d8b891f6555bdfd18c63cc" exitCode=0 Oct 01 12:52:37 crc kubenswrapper[4913]: I1001 12:52:37.797197 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" event={"ID":"c80e2a0f-0343-481e-8939-a9c3c861c50c","Type":"ContainerDied","Data":"850aceab03ec0644413f06e5e180e9986dccf2cd35d8b891f6555bdfd18c63cc"} Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.346639 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.363986 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-config\") pod \"c80e2a0f-0343-481e-8939-a9c3c861c50c\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.364286 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-dns-svc\") pod \"c80e2a0f-0343-481e-8939-a9c3c861c50c\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.364321 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqplc\" (UniqueName: \"kubernetes.io/projected/c80e2a0f-0343-481e-8939-a9c3c861c50c-kube-api-access-rqplc\") pod \"c80e2a0f-0343-481e-8939-a9c3c861c50c\" (UID: \"c80e2a0f-0343-481e-8939-a9c3c861c50c\") " Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.374575 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c80e2a0f-0343-481e-8939-a9c3c861c50c-kube-api-access-rqplc" (OuterVolumeSpecName: "kube-api-access-rqplc") pod "c80e2a0f-0343-481e-8939-a9c3c861c50c" (UID: "c80e2a0f-0343-481e-8939-a9c3c861c50c"). InnerVolumeSpecName "kube-api-access-rqplc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.414659 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c80e2a0f-0343-481e-8939-a9c3c861c50c" (UID: "c80e2a0f-0343-481e-8939-a9c3c861c50c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.432190 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-config" (OuterVolumeSpecName: "config") pod "c80e2a0f-0343-481e-8939-a9c3c861c50c" (UID: "c80e2a0f-0343-481e-8939-a9c3c861c50c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.465448 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.465481 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80e2a0f-0343-481e-8939-a9c3c861c50c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.465491 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqplc\" (UniqueName: \"kubernetes.io/projected/c80e2a0f-0343-481e-8939-a9c3c861c50c-kube-api-access-rqplc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.819544 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.820100 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-629ss" event={"ID":"c80e2a0f-0343-481e-8939-a9c3c861c50c","Type":"ContainerDied","Data":"0b3457e93016f76750d1818bdf16cdf9ac9ab6f3dafbe0f1de430f70ce0cd7fc"} Oct 01 12:52:38 crc kubenswrapper[4913]: I1001 12:52:38.821503 4913 scope.go:117] "RemoveContainer" containerID="850aceab03ec0644413f06e5e180e9986dccf2cd35d8b891f6555bdfd18c63cc" Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.055078 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-629ss"] Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.059670 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-629ss"] Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.174438 4913 scope.go:117] "RemoveContainer" containerID="567c14d62421db8f79776009c494e9fb2a4a1482ba08fa8da8eb6d669c05b1e8" Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.817195 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f930acb-c2e8-4c5f-8d1b-acdc15375467","Type":"ContainerStarted","Data":"989a0966f8055ad8bc734aea50b26b0b8dcdf9b2cfaf3f2a9b4d15d2402bb460"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.819321 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vjkgr" event={"ID":"e90d4e3a-3c02-4d5d-84f4-32d5cb411f77","Type":"ContainerStarted","Data":"65b6058bc205c6caff20c90d0e915dc4f689e3a6daa3fc6ccef28374cc87d6ed"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.819471 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vjkgr" Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.821239 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"71b728de-af3e-4c33-baad-98a46852c91f","Type":"ContainerStarted","Data":"e5843787f53e8b4ce005f449e1cf0c0e248445ef9ce597071ad95a06a444fdb1"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.823381 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7ck28" event={"ID":"da2fc463-557f-4a82-bd58-60b0e08930a4","Type":"ContainerStarted","Data":"0fffff4e50feee3faae73ba8b85af51d54cde888b3eac66174c1f560023d75af"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.824905 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fe74cb0a-552b-42dd-a1af-acb58e98b7dd","Type":"ContainerStarted","Data":"1d257d50687b1513d189ad4d286be9013732bbd94289e97e91d48e63f49f95d5"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.828465 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d","Type":"ContainerStarted","Data":"318d2efcb1bd0d803fea91494de10dede631b7ed91556b96a5548ff835c08a84"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.830492 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0cfc30d4-ee8f-4491-a558-cb067b0dec39","Type":"ContainerStarted","Data":"a3691f3408021c0824e57cd89cbca30352f688630aee942d81692fa737b20b04"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.830584 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 12:52:39 crc 
kubenswrapper[4913]: I1001 12:52:39.832714 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96fdbf7e-8781-4743-a69c-e56b650fb429","Type":"ContainerStarted","Data":"0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149"} Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.832851 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.839529 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vjkgr" podStartSLOduration=4.888498532 podStartE2EDuration="13.839498592s" podCreationTimestamp="2025-10-01 12:52:26 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.418665012 +0000 UTC m=+881.322140590" lastFinishedPulling="2025-10-01 12:52:38.369665072 +0000 UTC m=+890.273140650" observedRunningTime="2025-10-01 12:52:39.839313317 +0000 UTC m=+891.742788915" watchObservedRunningTime="2025-10-01 12:52:39.839498592 +0000 UTC m=+891.742974170" Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.963661 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.597764079 podStartE2EDuration="19.963642979s" podCreationTimestamp="2025-10-01 12:52:20 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.435108776 +0000 UTC m=+881.338584354" lastFinishedPulling="2025-10-01 12:52:35.800987676 +0000 UTC m=+887.704463254" observedRunningTime="2025-10-01 12:52:39.958499238 +0000 UTC m=+891.861974836" watchObservedRunningTime="2025-10-01 12:52:39.963642979 +0000 UTC m=+891.867118557" Oct 01 12:52:39 crc kubenswrapper[4913]: I1001 12:52:39.983232 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.074866089 podStartE2EDuration="17.983206549s" podCreationTimestamp="2025-10-01 12:52:22 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.415481345 +0000 UTC m=+881.318956923" lastFinishedPulling="2025-10-01 12:52:39.323821815 +0000 UTC m=+891.227297383" observedRunningTime="2025-10-01 12:52:39.977201073 +0000 UTC m=+891.880676661" watchObservedRunningTime="2025-10-01 12:52:39.983206549 +0000 UTC m=+891.886682127" Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.083688 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.084100 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.819670 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" path="/var/lib/kubelet/pods/c80e2a0f-0343-481e-8939-a9c3c861c50c/volumes" Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.844302 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"19a7d9a6-e81a-483a-8408-6784ded67834","Type":"ContainerStarted","Data":"b702e6d96a91056bec1b26da179fba714390228d163abda83dd7909687a79615"} Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.850189 4913 generic.go:334] "Generic (PLEG): container finished" podID="da2fc463-557f-4a82-bd58-60b0e08930a4" containerID="0fffff4e50feee3faae73ba8b85af51d54cde888b3eac66174c1f560023d75af" exitCode=0 Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.850351 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7ck28" event={"ID":"da2fc463-557f-4a82-bd58-60b0e08930a4","Type":"ContainerDied","Data":"0fffff4e50feee3faae73ba8b85af51d54cde888b3eac66174c1f560023d75af"} Oct 01 12:52:40 crc kubenswrapper[4913]: I1001 12:52:40.855636 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eafb0f7f-ea12-4d4f-9097-5923b9345bc0","Type":"ContainerStarted","Data":"0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27"} Oct 01 12:52:41 crc kubenswrapper[4913]: I1001 12:52:41.864286 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7ck28" event={"ID":"da2fc463-557f-4a82-bd58-60b0e08930a4","Type":"ContainerStarted","Data":"39ba75ad80e9d286f4bdafb9c4832609a4e4593f1c110b5238085dc6944644bf"} Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.876110 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7ck28" event={"ID":"da2fc463-557f-4a82-bd58-60b0e08930a4","Type":"ContainerStarted","Data":"61bf7b629b794f79f71d32e482524c8567691f05b79a337baf80505fd30b7daa"} Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.876461 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.876473 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7ck28" Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.878566 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f930acb-c2e8-4c5f-8d1b-acdc15375467","Type":"ContainerStarted","Data":"25e9afe4b7947e99efedadf0cfe93f201af1bbceaf0bf85124c763623262425a"} Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.881089 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"71b728de-af3e-4c33-baad-98a46852c91f","Type":"ContainerStarted","Data":"4ab9888d2bb011b95dcae8f693f198187d60415a4f497e83f9dbffdefde31872"} Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.900755 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7ck28" podStartSLOduration=8.390821056 podStartE2EDuration="16.900714537s" podCreationTimestamp="2025-10-01 12:52:26 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.767472453 +0000 UTC m=+881.670948031" lastFinishedPulling="2025-10-01 12:52:38.277365934 +0000 UTC m=+890.180841512" observedRunningTime="2025-10-01 12:52:42.897874338 +0000 UTC m=+894.801349916" watchObservedRunningTime="2025-10-01 12:52:42.900714537 +0000 UTC m=+894.804190115" Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.931671 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.061227553 podStartE2EDuration="16.93163336s" podCreationTimestamp="2025-10-01 12:52:26 +0000 UTC" firstStartedPulling="2025-10-01 
12:52:29.554689428 +0000 UTC m=+881.458165006" lastFinishedPulling="2025-10-01 12:52:42.425095235 +0000 UTC m=+894.328570813" observedRunningTime="2025-10-01 12:52:42.927020222 +0000 UTC m=+894.830495830" watchObservedRunningTime="2025-10-01 12:52:42.93163336 +0000 UTC m=+894.835108938" Oct 01 12:52:42 crc kubenswrapper[4913]: I1001 12:52:42.958689 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=2.947402109 podStartE2EDuration="14.958666107s" podCreationTimestamp="2025-10-01 12:52:28 +0000 UTC" firstStartedPulling="2025-10-01 12:52:30.408635504 +0000 UTC m=+882.312111092" lastFinishedPulling="2025-10-01 12:52:42.419899522 +0000 UTC m=+894.323375090" observedRunningTime="2025-10-01 12:52:42.958614544 +0000 UTC m=+894.862090142" watchObservedRunningTime="2025-10-01 12:52:42.958666107 +0000 UTC m=+894.862141675" Oct 01 12:52:43 crc kubenswrapper[4913]: I1001 12:52:43.902958 4913 generic.go:334] "Generic (PLEG): container finished" podID="fe74cb0a-552b-42dd-a1af-acb58e98b7dd" containerID="1d257d50687b1513d189ad4d286be9013732bbd94289e97e91d48e63f49f95d5" exitCode=0 Oct 01 12:52:43 crc kubenswrapper[4913]: I1001 12:52:43.903068 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fe74cb0a-552b-42dd-a1af-acb58e98b7dd","Type":"ContainerDied","Data":"1d257d50687b1513d189ad4d286be9013732bbd94289e97e91d48e63f49f95d5"} Oct 01 12:52:43 crc kubenswrapper[4913]: I1001 12:52:43.905083 4913 generic.go:334] "Generic (PLEG): container finished" podID="f9cea593-9b1f-44f0-8f3a-831b4b0ee98d" containerID="318d2efcb1bd0d803fea91494de10dede631b7ed91556b96a5548ff835c08a84" exitCode=0 Oct 01 12:52:43 crc kubenswrapper[4913]: I1001 12:52:43.905194 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d","Type":"ContainerDied","Data":"318d2efcb1bd0d803fea91494de10dede631b7ed91556b96a5548ff835c08a84"} Oct 01 12:52:44 crc kubenswrapper[4913]: I1001 12:52:44.790251 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:44 crc kubenswrapper[4913]: I1001 12:52:44.790621 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:44 crc kubenswrapper[4913]: I1001 12:52:44.838736 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:44 crc kubenswrapper[4913]: I1001 12:52:44.913772 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fe74cb0a-552b-42dd-a1af-acb58e98b7dd","Type":"ContainerStarted","Data":"1f355732ed1d99f0c495ab6ff18224501b1b635849540821bd4f8709629e1c19"} Oct 01 12:52:44 crc kubenswrapper[4913]: I1001 12:52:44.916926 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f9cea593-9b1f-44f0-8f3a-831b4b0ee98d","Type":"ContainerStarted","Data":"3610c284a082ac17a31f402aa9672201d778d5aa19c198c5e3bd13d98d91b424"} Oct 01 12:52:44 crc kubenswrapper[4913]: I1001 12:52:44.981829 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.249700 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86f865fc45-5rgmz"] Oct 01 12:52:45 crc kubenswrapper[4913]: E1001 12:52:45.250402 4913 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerName="dnsmasq-dns" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.250427 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerName="dnsmasq-dns" Oct 01 12:52:45 crc kubenswrapper[4913]: E1001 12:52:45.250450 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerName="init" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.250460 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerName="init" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.250659 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80e2a0f-0343-481e-8939-a9c3c861c50c" containerName="dnsmasq-dns" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.251700 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.253795 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.267290 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f865fc45-5rgmz"] Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.299290 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-k7dpl"] Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.300205 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.301981 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.313507 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k7dpl"] Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.375421 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-ovsdbserver-sb\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.375513 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-dns-svc\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.375674 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-config\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.375717 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln4qp\" (UniqueName: 
\"kubernetes.io/projected/a991795a-f96b-477c-9b51-eb160a6d1878-kube-api-access-ln4qp\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.477500 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59c8670e-1109-40cd-a637-636023bbd6d5-ovn-rundir\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.477567 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-ovsdbserver-sb\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.477610 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c8670e-1109-40cd-a637-636023bbd6d5-combined-ca-bundle\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.477637 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-dns-svc\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.477655 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c8670e-1109-40cd-a637-636023bbd6d5-config\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.477859 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c8670e-1109-40cd-a637-636023bbd6d5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.478037 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-config\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.478074 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln4qp\" (UniqueName: \"kubernetes.io/projected/a991795a-f96b-477c-9b51-eb160a6d1878-kube-api-access-ln4qp\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.478121 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59c8670e-1109-40cd-a637-636023bbd6d5-ovs-rundir\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.478219 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hxp\" (UniqueName: \"kubernetes.io/projected/59c8670e-1109-40cd-a637-636023bbd6d5-kube-api-access-q6hxp\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.478692 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-dns-svc\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.478975 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-config\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.479234 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-ovsdbserver-sb\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.518331 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln4qp\" (UniqueName: \"kubernetes.io/projected/a991795a-f96b-477c-9b51-eb160a6d1878-kube-api-access-ln4qp\") pod \"dnsmasq-dns-86f865fc45-5rgmz\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") " pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.568509 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.579864 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c8670e-1109-40cd-a637-636023bbd6d5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.579925 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59c8670e-1109-40cd-a637-636023bbd6d5-ovs-rundir\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.579961 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hxp\" (UniqueName: \"kubernetes.io/projected/59c8670e-1109-40cd-a637-636023bbd6d5-kube-api-access-q6hxp\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.580001 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59c8670e-1109-40cd-a637-636023bbd6d5-ovn-rundir\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.580038 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c8670e-1109-40cd-a637-636023bbd6d5-combined-ca-bundle\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.580054 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c8670e-1109-40cd-a637-636023bbd6d5-config\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.581192 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59c8670e-1109-40cd-a637-636023bbd6d5-ovn-rundir\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.581213 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59c8670e-1109-40cd-a637-636023bbd6d5-ovs-rundir\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.581636 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.582037 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/59c8670e-1109-40cd-a637-636023bbd6d5-config\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.586049 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c8670e-1109-40cd-a637-636023bbd6d5-combined-ca-bundle\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.592869 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c8670e-1109-40cd-a637-636023bbd6d5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.612981 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hxp\" (UniqueName: \"kubernetes.io/projected/59c8670e-1109-40cd-a637-636023bbd6d5-kube-api-access-q6hxp\") pod \"ovn-controller-metrics-k7dpl\" (UID: \"59c8670e-1109-40cd-a637-636023bbd6d5\") " pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.613652 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f865fc45-5rgmz"] Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.620709 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7dpl" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.632373 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-8fr2n"] Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.633929 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.637575 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.645852 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-8fr2n"] Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.679942 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.783896 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-config\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.784390 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.784447 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.784492 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.784534 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdsd\" (UniqueName: \"kubernetes.io/projected/3f29a2a3-85a3-40a0-ab42-1e575dea129c-kube-api-access-5kdsd\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.886670 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.886736 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.886762 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdsd\" (UniqueName: 
\"kubernetes.io/projected/3f29a2a3-85a3-40a0-ab42-1e575dea129c-kube-api-access-5kdsd\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.886846 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-config\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.886866 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.887912 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.888485 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.888499 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.888698 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-config\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.913814 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdsd\" (UniqueName: \"kubernetes.io/projected/3f29a2a3-85a3-40a0-ab42-1e575dea129c-kube-api-access-5kdsd\") pod \"dnsmasq-dns-5d86d68bf7-8fr2n\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.926610 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.948915 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.096255826 podStartE2EDuration="26.948898481s" podCreationTimestamp="2025-10-01 12:52:19 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.657931538 +0000 UTC m=+881.561407116" lastFinishedPulling="2025-10-01 12:52:38.510574193 +0000 UTC m=+890.414049771" observedRunningTime="2025-10-01 12:52:45.946091624 +0000 UTC 
m=+897.849567222" watchObservedRunningTime="2025-10-01 12:52:45.948898481 +0000 UTC m=+897.852374049" Oct 01 12:52:45 crc kubenswrapper[4913]: I1001 12:52:45.966358 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.017896 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.082038 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.098678 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f865fc45-5rgmz"] Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.099055 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.108875 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.109141 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x2m79" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.118589 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.125199 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.147835 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203333 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-scripts\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203377 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203417 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203478 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-config\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203500 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203517 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.203542 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bp4g\" (UniqueName: \"kubernetes.io/projected/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-kube-api-access-4bp4g\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.218174 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.275371 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k7dpl"] Oct 01 12:52:46 crc kubenswrapper[4913]: W1001 12:52:46.291578 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59c8670e_1109_40cd_a637_636023bbd6d5.slice/crio-937801a36e4ab007fad86a3aa53c20ca054ddd5e6194324b22297c6de69f3836 WatchSource:0}: Error finding container 937801a36e4ab007fad86a3aa53c20ca054ddd5e6194324b22297c6de69f3836: Status 404 returned error can't find the container with id 937801a36e4ab007fad86a3aa53c20ca054ddd5e6194324b22297c6de69f3836 Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.305185 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.305231 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.305304 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bp4g\" (UniqueName: \"kubernetes.io/projected/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-kube-api-access-4bp4g\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.306235 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-scripts\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.306947 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-scripts\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 
12:52:46.306989 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.307036 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.307129 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-config\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.307701 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-config\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.307960 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.316680 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.317029 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.327760 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.335537 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bp4g\" (UniqueName: \"kubernetes.io/projected/4ed4667c-0b5b-4e01-b482-4ecb3caebbad-kube-api-access-4bp4g\") pod \"ovn-northd-0\" (UID: \"4ed4667c-0b5b-4e01-b482-4ecb3caebbad\") " pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.481006 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.647057 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-8fr2n"] Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.925401 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 12:52:46 crc kubenswrapper[4913]: W1001 12:52:46.934357 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ed4667c_0b5b_4e01_b482_4ecb3caebbad.slice/crio-905adb9d1faa280df534cbf757ab096e69ba1c5ee70fe59e3f1ecf8cd75ec14f WatchSource:0}: Error finding container 905adb9d1faa280df534cbf757ab096e69ba1c5ee70fe59e3f1ecf8cd75ec14f: Status 404 returned error can't find the container with id 905adb9d1faa280df534cbf757ab096e69ba1c5ee70fe59e3f1ecf8cd75ec14f Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.935839 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7dpl" event={"ID":"59c8670e-1109-40cd-a637-636023bbd6d5","Type":"ContainerStarted","Data":"937801a36e4ab007fad86a3aa53c20ca054ddd5e6194324b22297c6de69f3836"} Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.938281 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" event={"ID":"3f29a2a3-85a3-40a0-ab42-1e575dea129c","Type":"ContainerStarted","Data":"24876a6dad316d09d49e8c1251d537c6db18e24ce52e374c635f2d604950d138"} Oct 01 12:52:46 crc kubenswrapper[4913]: I1001 12:52:46.939244 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" event={"ID":"a991795a-f96b-477c-9b51-eb160a6d1878","Type":"ContainerStarted","Data":"38f550c474cfc7ccf20c88cde4c54b3a16607c80f172624eed43ae1d704e5ee1"} Oct 01 12:52:47 crc kubenswrapper[4913]: I1001 12:52:47.947574 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4ed4667c-0b5b-4e01-b482-4ecb3caebbad","Type":"ContainerStarted","Data":"905adb9d1faa280df534cbf757ab096e69ba1c5ee70fe59e3f1ecf8cd75ec14f"} Oct 01 12:52:48 crc kubenswrapper[4913]: I1001 12:52:48.976499 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.863578037 podStartE2EDuration="29.976480617s" podCreationTimestamp="2025-10-01 12:52:19 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.450664506 +0000 UTC m=+881.354140074" lastFinishedPulling="2025-10-01 12:52:38.563567066 +0000 UTC m=+890.467042654" observedRunningTime="2025-10-01 12:52:48.973242798 +0000 UTC m=+900.876718406" watchObservedRunningTime="2025-10-01 12:52:48.976480617 +0000 UTC m=+900.879956195" Oct 01 12:52:50 crc kubenswrapper[4913]: I1001 12:52:50.796154 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 12:52:50 crc kubenswrapper[4913]: I1001 12:52:50.796528 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 12:52:51 crc kubenswrapper[4913]: I1001 12:52:51.126229 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:51 crc kubenswrapper[4913]: I1001 12:52:51.126361 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:53 crc kubenswrapper[4913]: I1001 12:52:53.103554 4913 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 12:53:00 crc kubenswrapper[4913]: E1001 12:53:00.548424 4913 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.142:55370->38.102.83.142:36077: write tcp 38.102.83.142:55370->38.102.83.142:36077: write: broken pipe Oct 01 12:53:00 crc kubenswrapper[4913]: I1001 12:53:00.837859 4913 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1e122b22-fdd9-485f-92d7-e105f64e8a2f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1e122b22-fdd9-485f-92d7-e105f64e8a2f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1e122b22_fdd9_485f_92d7_e105f64e8a2f.slice" Oct 01 12:53:00 crc kubenswrapper[4913]: E1001 12:53:00.838169 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod1e122b22-fdd9-485f-92d7-e105f64e8a2f] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod1e122b22-fdd9-485f-92d7-e105f64e8a2f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1e122b22_fdd9_485f_92d7_e105f64e8a2f.slice" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" podUID="1e122b22-fdd9-485f-92d7-e105f64e8a2f" Oct 01 12:53:01 crc kubenswrapper[4913]: I1001 12:53:01.056917 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-j56vm" Oct 01 12:53:01 crc kubenswrapper[4913]: I1001 12:53:01.121677 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-j56vm"] Oct 01 12:53:01 crc kubenswrapper[4913]: I1001 12:53:01.126713 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-j56vm"] Oct 01 12:53:02 crc kubenswrapper[4913]: I1001 12:53:02.817863 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e122b22-fdd9-485f-92d7-e105f64e8a2f" path="/var/lib/kubelet/pods/1e122b22-fdd9-485f-92d7-e105f64e8a2f/volumes" Oct 01 12:53:08 crc kubenswrapper[4913]: E1001 12:53:08.686602 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2090084178/1\": happened during read: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82" Oct 01 12:53:08 crc kubenswrapper[4913]: E1001 12:53:08.687361 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-northd,Image:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82,Command:[/usr/bin/ovn-northd],Args:[-vfile:off -vconsole:info --n-threads=1 --ovnnb-db=ssl:ovsdbserver-nb-0.openstack.svc.cluster.local:6641 --ovnsb-db=ssl:ovsdbserver-sb-0.openstack.svc.cluster.local:6642 --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:certs,Value:n5b4h6dh5cfh5bh689hcbh5cdhf5hd5hcfh97h5fch594h69h578h5fchf8h9ch594h64fh5f7hch568h66fh5c7hchd9h8dh64bh59h559h6dq,ValueFrom:nil,},EnvVar{Name:ovnnorthd-config,Value:n5c8h7ch56bh8dh8hc4h5dch9dh68h6bhb7h598h549h5dbh66fh6bh5b4h5cch5d6h55ch57fhfch588h89h5ddh5d6h65bh65bh8dhc4h67dh569q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-scripts,Value:n664hd8h66ch58dh64hc9h66bhd4h558h697h67bh557hdch664h567h669h555h696h556h556h5fh5bh569hbh665h9dh4h9bh564hc8h5b7h5c4q,ValueFrom:nil,},EnvVar{Name:tls-ca-bundle.pem,Value:nd5h579h5fch579h68ch5ch79h68dh559h557h55ch97h547h686h65fh569h5f4h585h586h557h556h5dh584h5fbh5ddh89h5ch5cbh698h5f8h647h65dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bp4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-northd-0_openstack(4ed4667c-0b5b-4e01-b482-4ecb3caebbad): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2090084178/1\": happened during read: context canceled" logger="UnhandledError" 
Oct 01 12:53:08 crc kubenswrapper[4913]: I1001 12:53:08.714608 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 01 12:53:08 crc kubenswrapper[4913]: I1001 12:53:08.772660 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 01 12:53:08 crc kubenswrapper[4913]: E1001 12:53:08.960534 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2090084178/1\\\": happened during read: context canceled\"" pod="openstack/ovn-northd-0" podUID="4ed4667c-0b5b-4e01-b482-4ecb3caebbad"
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.129786 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7dpl" event={"ID":"59c8670e-1109-40cd-a637-636023bbd6d5","Type":"ContainerStarted","Data":"4ad28b7123cca4b0e71200cf16b5f70bdd170119ef10e46c8ece692fbcaefd1b"}
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.131188 4913 generic.go:334] "Generic (PLEG): container finished" podID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerID="d29fabf35383083e92e42434b0257736d224722a8cf16a02142b5a468ce773f1" exitCode=0
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.131255 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" event={"ID":"3f29a2a3-85a3-40a0-ab42-1e575dea129c","Type":"ContainerDied","Data":"d29fabf35383083e92e42434b0257736d224722a8cf16a02142b5a468ce773f1"}
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.132821 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4ed4667c-0b5b-4e01-b482-4ecb3caebbad","Type":"ContainerStarted","Data":"886b4206991b53ec6568f6939c48eba71a01570627b927085e661363cab56f81"}
Oct 01 12:53:09 crc kubenswrapper[4913]: E1001 12:53:09.134542 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82\\\"\"" pod="openstack/ovn-northd-0" podUID="4ed4667c-0b5b-4e01-b482-4ecb3caebbad"
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.140758 4913 generic.go:334] "Generic (PLEG): container finished" podID="a991795a-f96b-477c-9b51-eb160a6d1878" containerID="e16d938793b31f7873b85d22def42b66ba2f3d1c9cf76770ff4047f295d3034e" exitCode=0
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.140860 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" event={"ID":"a991795a-f96b-477c-9b51-eb160a6d1878","Type":"ContainerDied","Data":"e16d938793b31f7873b85d22def42b66ba2f3d1c9cf76770ff4047f295d3034e"}
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.165477 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-k7dpl" podStartSLOduration=24.165461591 podStartE2EDuration="24.165461591s" podCreationTimestamp="2025-10-01 12:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:09.148991231 +0000 UTC m=+921.052466829" watchObservedRunningTime="2025-10-01 12:53:09.165461591 +0000 UTC m=+921.068937169"
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.455082 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz"
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.610245 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-config\") pod \"a991795a-f96b-477c-9b51-eb160a6d1878\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") "
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.610637 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln4qp\" (UniqueName: \"kubernetes.io/projected/a991795a-f96b-477c-9b51-eb160a6d1878-kube-api-access-ln4qp\") pod \"a991795a-f96b-477c-9b51-eb160a6d1878\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") "
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.610681 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-dns-svc\") pod \"a991795a-f96b-477c-9b51-eb160a6d1878\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") "
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.610748 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-ovsdbserver-sb\") pod \"a991795a-f96b-477c-9b51-eb160a6d1878\" (UID: \"a991795a-f96b-477c-9b51-eb160a6d1878\") "
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.615282 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a991795a-f96b-477c-9b51-eb160a6d1878-kube-api-access-ln4qp" (OuterVolumeSpecName: "kube-api-access-ln4qp") pod "a991795a-f96b-477c-9b51-eb160a6d1878" (UID: "a991795a-f96b-477c-9b51-eb160a6d1878"). InnerVolumeSpecName "kube-api-access-ln4qp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.630359 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-config" (OuterVolumeSpecName: "config") pod "a991795a-f96b-477c-9b51-eb160a6d1878" (UID: "a991795a-f96b-477c-9b51-eb160a6d1878"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.631464 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a991795a-f96b-477c-9b51-eb160a6d1878" (UID: "a991795a-f96b-477c-9b51-eb160a6d1878"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.632816 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a991795a-f96b-477c-9b51-eb160a6d1878" (UID: "a991795a-f96b-477c-9b51-eb160a6d1878"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.712052 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.712083 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-config\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.712093 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln4qp\" (UniqueName: \"kubernetes.io/projected/a991795a-f96b-477c-9b51-eb160a6d1878-kube-api-access-ln4qp\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:09 crc kubenswrapper[4913]: I1001 12:53:09.712103 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a991795a-f96b-477c-9b51-eb160a6d1878-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.083679 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.083725 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.148360 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.148361 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f865fc45-5rgmz" event={"ID":"a991795a-f96b-477c-9b51-eb160a6d1878","Type":"ContainerDied","Data":"38f550c474cfc7ccf20c88cde4c54b3a16607c80f172624eed43ae1d704e5ee1"}
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.148516 4913 scope.go:117] "RemoveContainer" containerID="e16d938793b31f7873b85d22def42b66ba2f3d1c9cf76770ff4047f295d3034e"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.150723 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" event={"ID":"3f29a2a3-85a3-40a0-ab42-1e575dea129c","Type":"ContainerStarted","Data":"ba1215be9799425e72eaaeaa1a675a103ec50f80a68c1514503c9a71e2260e8c"}
Oct 01 12:53:10 crc kubenswrapper[4913]: E1001 12:53:10.152323 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82\\\"\"" pod="openstack/ovn-northd-0" podUID="4ed4667c-0b5b-4e01-b482-4ecb3caebbad"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.170557 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" podStartSLOduration=25.170530644 podStartE2EDuration="25.170530644s" podCreationTimestamp="2025-10-01 12:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:10.168047296 +0000 UTC m=+922.071522874" watchObservedRunningTime="2025-10-01 12:53:10.170530644 +0000 UTC m=+922.074006212"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.229395 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f865fc45-5rgmz"]
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.234389 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86f865fc45-5rgmz"]
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.817646 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a991795a-f96b-477c-9b51-eb160a6d1878" path="/var/lib/kubelet/pods/a991795a-f96b-477c-9b51-eb160a6d1878/volumes"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.848690 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5b585"]
Oct 01 12:53:10 crc kubenswrapper[4913]: E1001 12:53:10.849021 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a991795a-f96b-477c-9b51-eb160a6d1878" containerName="init"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.849041 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a991795a-f96b-477c-9b51-eb160a6d1878" containerName="init"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.849216 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a991795a-f96b-477c-9b51-eb160a6d1878" containerName="init"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.849841 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.855739 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5b585"]
Oct 01 12:53:10 crc kubenswrapper[4913]: I1001 12:53:10.933755 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4hh5\" (UniqueName: \"kubernetes.io/projected/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e-kube-api-access-q4hh5\") pod \"keystone-db-create-5b585\" (UID: \"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e\") " pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.018471 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.035428 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4hh5\" (UniqueName: \"kubernetes.io/projected/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e-kube-api-access-q4hh5\") pod \"keystone-db-create-5b585\" (UID: \"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e\") " pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.061925 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4hh5\" (UniqueName: \"kubernetes.io/projected/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e-kube-api-access-q4hh5\") pod \"keystone-db-create-5b585\" (UID: \"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e\") " pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.083161 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-g8njq"]
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.084612 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.099402 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-g8njq"]
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.167596 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.238533 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d9sr\" (UniqueName: \"kubernetes.io/projected/0410901f-4718-42a7-9e61-a64722c67b5c-kube-api-access-9d9sr\") pod \"placement-db-create-g8njq\" (UID: \"0410901f-4718-42a7-9e61-a64722c67b5c\") " pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.303749 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.340414 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d9sr\" (UniqueName: \"kubernetes.io/projected/0410901f-4718-42a7-9e61-a64722c67b5c-kube-api-access-9d9sr\") pod \"placement-db-create-g8njq\" (UID: \"0410901f-4718-42a7-9e61-a64722c67b5c\") " pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.355835 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.360982 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d9sr\" (UniqueName: \"kubernetes.io/projected/0410901f-4718-42a7-9e61-a64722c67b5c-kube-api-access-9d9sr\") pod \"placement-db-create-g8njq\" (UID: \"0410901f-4718-42a7-9e61-a64722c67b5c\") " pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.419836 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.569294 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5b585"]
Oct 01 12:53:11 crc kubenswrapper[4913]: W1001 12:53:11.577111 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2cae9e2_ba7f_48a1_b95f_4813bca1ea6e.slice/crio-75eee79535f5ea553b3489431c531c2d2fe6fff6da4aa9f320fdcc7f24c9fd96 WatchSource:0}: Error finding container 75eee79535f5ea553b3489431c531c2d2fe6fff6da4aa9f320fdcc7f24c9fd96: Status 404 returned error can't find the container with id 75eee79535f5ea553b3489431c531c2d2fe6fff6da4aa9f320fdcc7f24c9fd96
Oct 01 12:53:11 crc kubenswrapper[4913]: I1001 12:53:11.897136 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-g8njq"]
Oct 01 12:53:11 crc kubenswrapper[4913]: W1001 12:53:11.900102 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0410901f_4718_42a7_9e61_a64722c67b5c.slice/crio-be01cc2be49c8cb3aae7b5fe6fcdf6ebc9778380e26b424bc014d93aca61aab4 WatchSource:0}: Error finding container be01cc2be49c8cb3aae7b5fe6fcdf6ebc9778380e26b424bc014d93aca61aab4: Status 404 returned error can't find the container with id be01cc2be49c8cb3aae7b5fe6fcdf6ebc9778380e26b424bc014d93aca61aab4
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.166153 4913 generic.go:334] "Generic (PLEG): container finished" podID="19a7d9a6-e81a-483a-8408-6784ded67834" containerID="b702e6d96a91056bec1b26da179fba714390228d163abda83dd7909687a79615" exitCode=0
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.166216 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a7d9a6-e81a-483a-8408-6784ded67834","Type":"ContainerDied","Data":"b702e6d96a91056bec1b26da179fba714390228d163abda83dd7909687a79615"}
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.168017 4913 generic.go:334] "Generic (PLEG): container finished" podID="f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e" containerID="09ff27b0fdd6406176781e588fa5b1df77493dc22e94bd0198e7914f25b45a69" exitCode=0
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.168091 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5b585" event={"ID":"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e","Type":"ContainerDied","Data":"09ff27b0fdd6406176781e588fa5b1df77493dc22e94bd0198e7914f25b45a69"}
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.168116 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5b585" event={"ID":"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e","Type":"ContainerStarted","Data":"75eee79535f5ea553b3489431c531c2d2fe6fff6da4aa9f320fdcc7f24c9fd96"}
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.171508 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8njq" event={"ID":"0410901f-4718-42a7-9e61-a64722c67b5c","Type":"ContainerStarted","Data":"12b419c58d516e7add950651ec72776dcc9cb68fe99158a4f5f643d6dcd8bfba"}
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.171551 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8njq" event={"ID":"0410901f-4718-42a7-9e61-a64722c67b5c","Type":"ContainerStarted","Data":"be01cc2be49c8cb3aae7b5fe6fcdf6ebc9778380e26b424bc014d93aca61aab4"}
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.350413 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7ck28"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.352766 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vjkgr" podUID="e90d4e3a-3c02-4d5d-84f4-32d5cb411f77" containerName="ovn-controller" probeResult="failure" output=<
Oct 01 12:53:12 crc kubenswrapper[4913]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 01 12:53:12 crc kubenswrapper[4913]: >
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.361550 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7ck28"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.561516 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vjkgr-config-8wxtp"]
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.563537 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.565312 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.572667 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vjkgr-config-8wxtp"]
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.681148 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-scripts\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.681204 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run-ovn\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.681229 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfp2\" (UniqueName: \"kubernetes.io/projected/d5293b85-d9a0-48e8-808c-1e73e7769c51-kube-api-access-slfp2\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.681374 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.681472 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-additional-scripts\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.681533 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-log-ovn\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783598 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783643 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-additional-scripts\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783661 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-log-ovn\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783744 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-scripts\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783776 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run-ovn\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783801 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfp2\" (UniqueName: \"kubernetes.io/projected/d5293b85-d9a0-48e8-808c-1e73e7769c51-kube-api-access-slfp2\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783906 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783943 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-log-ovn\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.783987 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run-ovn\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.784588 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-additional-scripts\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.787083 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-scripts\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.810952 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfp2\" (UniqueName: \"kubernetes.io/projected/d5293b85-d9a0-48e8-808c-1e73e7769c51-kube-api-access-slfp2\") pod \"ovn-controller-vjkgr-config-8wxtp\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") " pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:12 crc kubenswrapper[4913]: I1001 12:53:12.891657 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.181173 4913 generic.go:334] "Generic (PLEG): container finished" podID="0410901f-4718-42a7-9e61-a64722c67b5c" containerID="12b419c58d516e7add950651ec72776dcc9cb68fe99158a4f5f643d6dcd8bfba" exitCode=0
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.181466 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8njq" event={"ID":"0410901f-4718-42a7-9e61-a64722c67b5c","Type":"ContainerDied","Data":"12b419c58d516e7add950651ec72776dcc9cb68fe99158a4f5f643d6dcd8bfba"}
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.185846 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a7d9a6-e81a-483a-8408-6784ded67834","Type":"ContainerStarted","Data":"17fd7e042b77989a88964c948508ab83e998d19bc322214f5087728d5df3fcaa"}
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.186435 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.187957 4913 generic.go:334] "Generic (PLEG): container finished" podID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerID="0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27" exitCode=0
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.188061 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eafb0f7f-ea12-4d4f-9097-5923b9345bc0","Type":"ContainerDied","Data":"0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27"}
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.209186 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.328303849 podStartE2EDuration="57.209171143s" podCreationTimestamp="2025-10-01 12:52:16 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.101502396 +0000 UTC m=+881.004977974" lastFinishedPulling="2025-10-01 12:52:37.98236969 +0000 UTC m=+889.885845268" observedRunningTime="2025-10-01 12:53:13.20795848 +0000 UTC m=+925.111434068" watchObservedRunningTime="2025-10-01 12:53:13.209171143 +0000 UTC m=+925.112646721"
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.372963 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vjkgr-config-8wxtp"]
Oct 01 12:53:13 crc kubenswrapper[4913]: W1001 12:53:13.380524 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5293b85_d9a0_48e8_808c_1e73e7769c51.slice/crio-987165f1e6a256e18ce7a4aa0b64764f4598b98c99c59ddcb67dc85f55503f50 WatchSource:0}: Error finding container 987165f1e6a256e18ce7a4aa0b64764f4598b98c99c59ddcb67dc85f55503f50: Status 404 returned error can't find the container with id 987165f1e6a256e18ce7a4aa0b64764f4598b98c99c59ddcb67dc85f55503f50
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.505596 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.515709 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.609602 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4hh5\" (UniqueName: \"kubernetes.io/projected/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e-kube-api-access-q4hh5\") pod \"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e\" (UID: \"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e\") "
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.610044 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d9sr\" (UniqueName: \"kubernetes.io/projected/0410901f-4718-42a7-9e61-a64722c67b5c-kube-api-access-9d9sr\") pod \"0410901f-4718-42a7-9e61-a64722c67b5c\" (UID: \"0410901f-4718-42a7-9e61-a64722c67b5c\") "
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.615793 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e-kube-api-access-q4hh5" (OuterVolumeSpecName: "kube-api-access-q4hh5") pod "f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e" (UID: "f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e"). InnerVolumeSpecName "kube-api-access-q4hh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.616580 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0410901f-4718-42a7-9e61-a64722c67b5c-kube-api-access-9d9sr" (OuterVolumeSpecName: "kube-api-access-9d9sr") pod "0410901f-4718-42a7-9e61-a64722c67b5c" (UID: "0410901f-4718-42a7-9e61-a64722c67b5c"). InnerVolumeSpecName "kube-api-access-9d9sr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.712100 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4hh5\" (UniqueName: \"kubernetes.io/projected/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e-kube-api-access-q4hh5\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:13 crc kubenswrapper[4913]: I1001 12:53:13.712125 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d9sr\" (UniqueName: \"kubernetes.io/projected/0410901f-4718-42a7-9e61-a64722c67b5c-kube-api-access-9d9sr\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.197805 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5b585" event={"ID":"f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e","Type":"ContainerDied","Data":"75eee79535f5ea553b3489431c531c2d2fe6fff6da4aa9f320fdcc7f24c9fd96"}
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.198086 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75eee79535f5ea553b3489431c531c2d2fe6fff6da4aa9f320fdcc7f24c9fd96"
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.197846 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5b585"
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.200437 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eafb0f7f-ea12-4d4f-9097-5923b9345bc0","Type":"ContainerStarted","Data":"0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003"}
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.201198 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.203010 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-g8njq" event={"ID":"0410901f-4718-42a7-9e61-a64722c67b5c","Type":"ContainerDied","Data":"be01cc2be49c8cb3aae7b5fe6fcdf6ebc9778380e26b424bc014d93aca61aab4"}
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.203117 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be01cc2be49c8cb3aae7b5fe6fcdf6ebc9778380e26b424bc014d93aca61aab4"
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.203210 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-g8njq"
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.208882 4913 generic.go:334] "Generic (PLEG): container finished" podID="d5293b85-d9a0-48e8-808c-1e73e7769c51" containerID="ba645eac3745b09349271c20b4f7b9a1300165ad9616229a6de32a710cad40ed" exitCode=0
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.209327 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vjkgr-config-8wxtp" event={"ID":"d5293b85-d9a0-48e8-808c-1e73e7769c51","Type":"ContainerDied","Data":"ba645eac3745b09349271c20b4f7b9a1300165ad9616229a6de32a710cad40ed"}
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.209360 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vjkgr-config-8wxtp" event={"ID":"d5293b85-d9a0-48e8-808c-1e73e7769c51","Type":"ContainerStarted","Data":"987165f1e6a256e18ce7a4aa0b64764f4598b98c99c59ddcb67dc85f55503f50"}
Oct 01 12:53:14 crc kubenswrapper[4913]: I1001 12:53:14.233402 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.151416665 podStartE2EDuration="58.233386551s" podCreationTimestamp="2025-10-01 12:52:16 +0000 UTC" firstStartedPulling="2025-10-01 12:52:29.358322817 +0000 UTC m=+881.261798395" lastFinishedPulling="2025-10-01 12:52:38.440292703 +0000 UTC m=+890.343768281" observedRunningTime="2025-10-01 12:53:14.228564118 +0000 UTC m=+926.132039716" watchObservedRunningTime="2025-10-01 12:53:14.233386551 +0000 UTC m=+926.136862129"
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.496525 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.538773 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-scripts\") pod \"d5293b85-d9a0-48e8-808c-1e73e7769c51\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") "
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.538871 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slfp2\" (UniqueName: \"kubernetes.io/projected/d5293b85-d9a0-48e8-808c-1e73e7769c51-kube-api-access-slfp2\") pod \"d5293b85-d9a0-48e8-808c-1e73e7769c51\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") "
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.538900 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-additional-scripts\") pod \"d5293b85-d9a0-48e8-808c-1e73e7769c51\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") "
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.538960 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run\") pod \"d5293b85-d9a0-48e8-808c-1e73e7769c51\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") "
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539039 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-log-ovn\") pod \"d5293b85-d9a0-48e8-808c-1e73e7769c51\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") "
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539096 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run-ovn\") pod \"d5293b85-d9a0-48e8-808c-1e73e7769c51\" (UID: \"d5293b85-d9a0-48e8-808c-1e73e7769c51\") "
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539102 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run" (OuterVolumeSpecName: "var-run") pod "d5293b85-d9a0-48e8-808c-1e73e7769c51" (UID: "d5293b85-d9a0-48e8-808c-1e73e7769c51"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539187 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d5293b85-d9a0-48e8-808c-1e73e7769c51" (UID: "d5293b85-d9a0-48e8-808c-1e73e7769c51"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539249 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d5293b85-d9a0-48e8-808c-1e73e7769c51" (UID: "d5293b85-d9a0-48e8-808c-1e73e7769c51"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539403 4913 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-log-ovn\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539414 4913 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539424 4913 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5293b85-d9a0-48e8-808c-1e73e7769c51-var-run\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.539782 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d5293b85-d9a0-48e8-808c-1e73e7769c51" (UID: "d5293b85-d9a0-48e8-808c-1e73e7769c51"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.540078 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-scripts" (OuterVolumeSpecName: "scripts") pod "d5293b85-d9a0-48e8-808c-1e73e7769c51" (UID: "d5293b85-d9a0-48e8-808c-1e73e7769c51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.550502 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5293b85-d9a0-48e8-808c-1e73e7769c51-kube-api-access-slfp2" (OuterVolumeSpecName: "kube-api-access-slfp2") pod "d5293b85-d9a0-48e8-808c-1e73e7769c51" (UID: "d5293b85-d9a0-48e8-808c-1e73e7769c51"). InnerVolumeSpecName "kube-api-access-slfp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.641000 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.641032 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slfp2\" (UniqueName: \"kubernetes.io/projected/d5293b85-d9a0-48e8-808c-1e73e7769c51-kube-api-access-slfp2\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:15 crc kubenswrapper[4913]: I1001 12:53:15.641041 4913 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5293b85-d9a0-48e8-808c-1e73e7769c51-additional-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.020538 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.072195 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-rksl6"]
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.072494 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerName="dnsmasq-dns" containerID="cri-o://bacfa30e75211038dad6d41161894a529ac701b363016d65c9fc4dc35ffa76c7" gracePeriod=10
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.264498 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vjkgr-config-8wxtp"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.264489 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vjkgr-config-8wxtp" event={"ID":"d5293b85-d9a0-48e8-808c-1e73e7769c51","Type":"ContainerDied","Data":"987165f1e6a256e18ce7a4aa0b64764f4598b98c99c59ddcb67dc85f55503f50"}
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.264644 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987165f1e6a256e18ce7a4aa0b64764f4598b98c99c59ddcb67dc85f55503f50"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.266388 4913 generic.go:334] "Generic (PLEG): container finished" podID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerID="bacfa30e75211038dad6d41161894a529ac701b363016d65c9fc4dc35ffa76c7" exitCode=0
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.266423 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" event={"ID":"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb","Type":"ContainerDied","Data":"bacfa30e75211038dad6d41161894a529ac701b363016d65c9fc4dc35ffa76c7"}
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409312 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wbw8j"]
Oct 01 12:53:16 crc kubenswrapper[4913]: E1001 12:53:16.409678 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0410901f-4718-42a7-9e61-a64722c67b5c" containerName="mariadb-database-create"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409699 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0410901f-4718-42a7-9e61-a64722c67b5c" containerName="mariadb-database-create"
Oct 01 12:53:16 crc kubenswrapper[4913]: E1001 12:53:16.409732 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5293b85-d9a0-48e8-808c-1e73e7769c51" containerName="ovn-config"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409741 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5293b85-d9a0-48e8-808c-1e73e7769c51" containerName="ovn-config"
Oct 01 12:53:16 crc kubenswrapper[4913]: E1001 12:53:16.409754 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e" containerName="mariadb-database-create"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409761 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e" containerName="mariadb-database-create"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409947 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5293b85-d9a0-48e8-808c-1e73e7769c51" containerName="ovn-config"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409971 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0410901f-4718-42a7-9e61-a64722c67b5c" containerName="mariadb-database-create"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.409984 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e" containerName="mariadb-database-create"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.411243 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.421765 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wbw8j"]
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.452533 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqwcm\" (UniqueName: \"kubernetes.io/projected/def990e9-e3a6-44f5-9a22-fcab13b131b0-kube-api-access-mqwcm\") pod \"glance-db-create-wbw8j\" (UID: \"def990e9-e3a6-44f5-9a22-fcab13b131b0\") " pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.554470 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqwcm\" (UniqueName: \"kubernetes.io/projected/def990e9-e3a6-44f5-9a22-fcab13b131b0-kube-api-access-mqwcm\") pod \"glance-db-create-wbw8j\" (UID: \"def990e9-e3a6-44f5-9a22-fcab13b131b0\") " pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.579896 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqwcm\" (UniqueName: \"kubernetes.io/projected/def990e9-e3a6-44f5-9a22-fcab13b131b0-kube-api-access-mqwcm\") pod \"glance-db-create-wbw8j\" (UID: \"def990e9-e3a6-44f5-9a22-fcab13b131b0\") " pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.607963 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vjkgr-config-8wxtp"]
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.613001 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vjkgr-config-8wxtp"]
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.636510 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-rksl6"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.731997 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.758535 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4kbl\" (UniqueName: \"kubernetes.io/projected/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-kube-api-access-h4kbl\") pod \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") "
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.758595 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-config\") pod \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") "
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.758651 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-dns-svc\") pod \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\" (UID: \"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb\") "
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.762611 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-kube-api-access-h4kbl" (OuterVolumeSpecName: "kube-api-access-h4kbl") pod "3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" (UID: "3e2fa990-1cf6-4826-9bbd-0e02bf405bcb"). InnerVolumeSpecName "kube-api-access-h4kbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.806084 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-config" (OuterVolumeSpecName: "config") pod "3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" (UID: "3e2fa990-1cf6-4826-9bbd-0e02bf405bcb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.813236 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" (UID: "3e2fa990-1cf6-4826-9bbd-0e02bf405bcb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.817149 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5293b85-d9a0-48e8-808c-1e73e7769c51" path="/var/lib/kubelet/pods/d5293b85-d9a0-48e8-808c-1e73e7769c51/volumes"
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.861031 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-config\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.861072 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:16 crc kubenswrapper[4913]: I1001 12:53:16.861086 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4kbl\" (UniqueName: \"kubernetes.io/projected/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb-kube-api-access-h4kbl\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.206903 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wbw8j"]
Oct 01 12:53:17 crc kubenswrapper[4913]: W1001 12:53:17.211680 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddef990e9_e3a6_44f5_9a22_fcab13b131b0.slice/crio-77fb27a3f5ee3bdebf66434668ad1c9a5de59e2e2747dcec7c9fd5689c0ba418 WatchSource:0}: Error finding container 77fb27a3f5ee3bdebf66434668ad1c9a5de59e2e2747dcec7c9fd5689c0ba418: Status 404 returned error can't find the container with id 77fb27a3f5ee3bdebf66434668ad1c9a5de59e2e2747dcec7c9fd5689c0ba418
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.274393 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wbw8j" event={"ID":"def990e9-e3a6-44f5-9a22-fcab13b131b0","Type":"ContainerStarted","Data":"77fb27a3f5ee3bdebf66434668ad1c9a5de59e2e2747dcec7c9fd5689c0ba418"}
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.276575 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-rksl6" event={"ID":"3e2fa990-1cf6-4826-9bbd-0e02bf405bcb","Type":"ContainerDied","Data":"3eeccee84476b090b7664ed40be9bfc831cd51bf6e621295ba4f48a531023e06"}
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.276620 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-rksl6"
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.276633 4913 scope.go:117] "RemoveContainer" containerID="bacfa30e75211038dad6d41161894a529ac701b363016d65c9fc4dc35ffa76c7"
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.295763 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-rksl6"]
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.297615 4913 scope.go:117] "RemoveContainer" containerID="7ad946ff417de7525a068efb1840e4e101e5106c6fd05cda75c11ec99a2ea8b3"
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.304167 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-rksl6"]
Oct 01 12:53:17 crc kubenswrapper[4913]: I1001 12:53:17.361958 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vjkgr"
Oct 01 12:53:18 crc kubenswrapper[4913]: I1001 12:53:18.285465 4913 generic.go:334] "Generic (PLEG): container finished" podID="def990e9-e3a6-44f5-9a22-fcab13b131b0" containerID="c2de77f0dde39eb1b18feac456663932fda6e8cfeff231d0c283c45a50884608" exitCode=0
Oct 01 12:53:18 crc kubenswrapper[4913]: I1001 12:53:18.285513 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wbw8j" event={"ID":"def990e9-e3a6-44f5-9a22-fcab13b131b0","Type":"ContainerDied","Data":"c2de77f0dde39eb1b18feac456663932fda6e8cfeff231d0c283c45a50884608"}
Oct 01 12:53:18 crc kubenswrapper[4913]: I1001 12:53:18.817179 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" path="/var/lib/kubelet/pods/3e2fa990-1cf6-4826-9bbd-0e02bf405bcb/volumes"
Oct 01 12:53:19 crc kubenswrapper[4913]: I1001 12:53:19.625089 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:19 crc kubenswrapper[4913]: I1001 12:53:19.704211 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqwcm\" (UniqueName: \"kubernetes.io/projected/def990e9-e3a6-44f5-9a22-fcab13b131b0-kube-api-access-mqwcm\") pod \"def990e9-e3a6-44f5-9a22-fcab13b131b0\" (UID: \"def990e9-e3a6-44f5-9a22-fcab13b131b0\") "
Oct 01 12:53:19 crc kubenswrapper[4913]: I1001 12:53:19.709480 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def990e9-e3a6-44f5-9a22-fcab13b131b0-kube-api-access-mqwcm" (OuterVolumeSpecName: "kube-api-access-mqwcm") pod "def990e9-e3a6-44f5-9a22-fcab13b131b0" (UID: "def990e9-e3a6-44f5-9a22-fcab13b131b0"). InnerVolumeSpecName "kube-api-access-mqwcm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:19 crc kubenswrapper[4913]: I1001 12:53:19.806295 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqwcm\" (UniqueName: \"kubernetes.io/projected/def990e9-e3a6-44f5-9a22-fcab13b131b0-kube-api-access-mqwcm\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.305814 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wbw8j" event={"ID":"def990e9-e3a6-44f5-9a22-fcab13b131b0","Type":"ContainerDied","Data":"77fb27a3f5ee3bdebf66434668ad1c9a5de59e2e2747dcec7c9fd5689c0ba418"}
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.305853 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77fb27a3f5ee3bdebf66434668ad1c9a5de59e2e2747dcec7c9fd5689c0ba418"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.305887 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wbw8j"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.865526 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8b7e-account-create-p85hr"]
Oct 01 12:53:20 crc kubenswrapper[4913]: E1001 12:53:20.866197 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerName="init"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.866213 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerName="init"
Oct 01 12:53:20 crc kubenswrapper[4913]: E1001 12:53:20.866229 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerName="dnsmasq-dns"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.866236 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerName="dnsmasq-dns"
Oct 01 12:53:20 crc kubenswrapper[4913]: E1001 12:53:20.866253 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def990e9-e3a6-44f5-9a22-fcab13b131b0" containerName="mariadb-database-create"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.866261 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="def990e9-e3a6-44f5-9a22-fcab13b131b0" containerName="mariadb-database-create"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.866591 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2fa990-1cf6-4826-9bbd-0e02bf405bcb" containerName="dnsmasq-dns"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.866617 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="def990e9-e3a6-44f5-9a22-fcab13b131b0" containerName="mariadb-database-create"
Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.867299 4913 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.876594 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.882810 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b7e-account-create-p85hr"] Oct 01 12:53:20 crc kubenswrapper[4913]: I1001 12:53:20.920940 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmqv\" (UniqueName: \"kubernetes.io/projected/afbc2f4d-f870-4b0e-a484-4feefbf89762-kube-api-access-fvmqv\") pod \"keystone-8b7e-account-create-p85hr\" (UID: \"afbc2f4d-f870-4b0e-a484-4feefbf89762\") " pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.022238 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmqv\" (UniqueName: \"kubernetes.io/projected/afbc2f4d-f870-4b0e-a484-4feefbf89762-kube-api-access-fvmqv\") pod \"keystone-8b7e-account-create-p85hr\" (UID: \"afbc2f4d-f870-4b0e-a484-4feefbf89762\") " pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.040737 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmqv\" (UniqueName: \"kubernetes.io/projected/afbc2f4d-f870-4b0e-a484-4feefbf89762-kube-api-access-fvmqv\") pod \"keystone-8b7e-account-create-p85hr\" (UID: \"afbc2f4d-f870-4b0e-a484-4feefbf89762\") " pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.189028 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.269084 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-154d-account-create-kmjfl"] Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.270704 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.273568 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.277047 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-154d-account-create-kmjfl"] Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.328442 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55tm5\" (UniqueName: \"kubernetes.io/projected/792a476e-74a0-4af7-be72-f496b501e22f-kube-api-access-55tm5\") pod \"placement-154d-account-create-kmjfl\" (UID: \"792a476e-74a0-4af7-be72-f496b501e22f\") " pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.430874 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55tm5\" (UniqueName: \"kubernetes.io/projected/792a476e-74a0-4af7-be72-f496b501e22f-kube-api-access-55tm5\") pod \"placement-154d-account-create-kmjfl\" (UID: \"792a476e-74a0-4af7-be72-f496b501e22f\") " pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.450008 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55tm5\" (UniqueName: \"kubernetes.io/projected/792a476e-74a0-4af7-be72-f496b501e22f-kube-api-access-55tm5\") pod \"placement-154d-account-create-kmjfl\" (UID: \"792a476e-74a0-4af7-be72-f496b501e22f\") " pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.610860 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.641238 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b7e-account-create-p85hr"] Oct 01 12:53:21 crc kubenswrapper[4913]: W1001 12:53:21.652372 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafbc2f4d_f870_4b0e_a484_4feefbf89762.slice/crio-b5d1d7f7bfb6628f9acd613cf32ff580025898fcb2af8f6c574d8104bf8a556e WatchSource:0}: Error finding container b5d1d7f7bfb6628f9acd613cf32ff580025898fcb2af8f6c574d8104bf8a556e: Status 404 returned error can't find the container with id b5d1d7f7bfb6628f9acd613cf32ff580025898fcb2af8f6c574d8104bf8a556e Oct 01 12:53:21 crc kubenswrapper[4913]: I1001 12:53:21.798821 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-154d-account-create-kmjfl"] Oct 01 12:53:21 crc kubenswrapper[4913]: W1001 12:53:21.805879 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod792a476e_74a0_4af7_be72_f496b501e22f.slice/crio-3ba8445337d8da46e87bf00c0204b985ea6be6a45b2d450b5b15899b4ebb6a7d WatchSource:0}: Error finding container 3ba8445337d8da46e87bf00c0204b985ea6be6a45b2d450b5b15899b4ebb6a7d: Status 404 returned error can't find the container with id 3ba8445337d8da46e87bf00c0204b985ea6be6a45b2d450b5b15899b4ebb6a7d Oct 01 12:53:22 crc kubenswrapper[4913]: I1001 12:53:22.327990 4913 generic.go:334] "Generic (PLEG): container finished" podID="792a476e-74a0-4af7-be72-f496b501e22f" containerID="418eea09dd911901ce85e19b83e523841077d62e27eb30f64fa56d6d516abbd0" exitCode=0 Oct 01 12:53:22 crc kubenswrapper[4913]: I1001 12:53:22.328305 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-154d-account-create-kmjfl" event={"ID":"792a476e-74a0-4af7-be72-f496b501e22f","Type":"ContainerDied","Data":"418eea09dd911901ce85e19b83e523841077d62e27eb30f64fa56d6d516abbd0"} Oct 01 12:53:22 crc kubenswrapper[4913]: I1001 12:53:22.328488 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-154d-account-create-kmjfl" event={"ID":"792a476e-74a0-4af7-be72-f496b501e22f","Type":"ContainerStarted","Data":"3ba8445337d8da46e87bf00c0204b985ea6be6a45b2d450b5b15899b4ebb6a7d"} Oct 01 12:53:22 crc kubenswrapper[4913]: I1001 12:53:22.331828 4913 generic.go:334] "Generic (PLEG): container finished" podID="afbc2f4d-f870-4b0e-a484-4feefbf89762" containerID="bade0a4dacc7ccca93f454ba26c19c88d79bbf33590977279aa5ac8cb8a578d2" exitCode=0 Oct 01 12:53:22 crc kubenswrapper[4913]: I1001 12:53:22.331880 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b7e-account-create-p85hr" event={"ID":"afbc2f4d-f870-4b0e-a484-4feefbf89762","Type":"ContainerDied","Data":"bade0a4dacc7ccca93f454ba26c19c88d79bbf33590977279aa5ac8cb8a578d2"} Oct 01 12:53:22 crc kubenswrapper[4913]: I1001 12:53:22.331909 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b7e-account-create-p85hr" event={"ID":"afbc2f4d-f870-4b0e-a484-4feefbf89762","Type":"ContainerStarted","Data":"b5d1d7f7bfb6628f9acd613cf32ff580025898fcb2af8f6c574d8104bf8a556e"} Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.346556 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"4ed4667c-0b5b-4e01-b482-4ecb3caebbad","Type":"ContainerStarted","Data":"a0f87a656ff042c7654fc9bf5e8e0cbaa2e7e34ea0a5c891632a65fd2182246a"} Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.347041 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.376913 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.4031000279999999 podStartE2EDuration="37.376898612s" podCreationTimestamp="2025-10-01 12:52:46 +0000 UTC" firstStartedPulling="2025-10-01 12:52:46.935957562 +0000 UTC m=+898.839433140" lastFinishedPulling="2025-10-01 12:53:22.909756146 +0000 UTC m=+934.813231724" observedRunningTime="2025-10-01 12:53:23.376721667 +0000 UTC m=+935.280197315" watchObservedRunningTime="2025-10-01 12:53:23.376898612 +0000 UTC m=+935.280374180" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.714192 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.722603 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.812323 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55tm5\" (UniqueName: \"kubernetes.io/projected/792a476e-74a0-4af7-be72-f496b501e22f-kube-api-access-55tm5\") pod \"792a476e-74a0-4af7-be72-f496b501e22f\" (UID: \"792a476e-74a0-4af7-be72-f496b501e22f\") " Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.812426 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmqv\" (UniqueName: \"kubernetes.io/projected/afbc2f4d-f870-4b0e-a484-4feefbf89762-kube-api-access-fvmqv\") pod \"afbc2f4d-f870-4b0e-a484-4feefbf89762\" (UID: \"afbc2f4d-f870-4b0e-a484-4feefbf89762\") " Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.819073 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792a476e-74a0-4af7-be72-f496b501e22f-kube-api-access-55tm5" (OuterVolumeSpecName: "kube-api-access-55tm5") pod "792a476e-74a0-4af7-be72-f496b501e22f" (UID: "792a476e-74a0-4af7-be72-f496b501e22f"). InnerVolumeSpecName "kube-api-access-55tm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.819109 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbc2f4d-f870-4b0e-a484-4feefbf89762-kube-api-access-fvmqv" (OuterVolumeSpecName: "kube-api-access-fvmqv") pod "afbc2f4d-f870-4b0e-a484-4feefbf89762" (UID: "afbc2f4d-f870-4b0e-a484-4feefbf89762"). InnerVolumeSpecName "kube-api-access-fvmqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.914767 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55tm5\" (UniqueName: \"kubernetes.io/projected/792a476e-74a0-4af7-be72-f496b501e22f-kube-api-access-55tm5\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:23 crc kubenswrapper[4913]: I1001 12:53:23.914802 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmqv\" (UniqueName: \"kubernetes.io/projected/afbc2f4d-f870-4b0e-a484-4feefbf89762-kube-api-access-fvmqv\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:24 crc kubenswrapper[4913]: I1001 12:53:24.359959 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-154d-account-create-kmjfl" event={"ID":"792a476e-74a0-4af7-be72-f496b501e22f","Type":"ContainerDied","Data":"3ba8445337d8da46e87bf00c0204b985ea6be6a45b2d450b5b15899b4ebb6a7d"} Oct 01 12:53:24 crc kubenswrapper[4913]: I1001 12:53:24.360016 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ba8445337d8da46e87bf00c0204b985ea6be6a45b2d450b5b15899b4ebb6a7d" Oct 01 12:53:24 crc kubenswrapper[4913]: I1001 12:53:24.360079 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-154d-account-create-kmjfl" Oct 01 12:53:24 crc kubenswrapper[4913]: I1001 12:53:24.363873 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b7e-account-create-p85hr" Oct 01 12:53:24 crc kubenswrapper[4913]: I1001 12:53:24.363882 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b7e-account-create-p85hr" event={"ID":"afbc2f4d-f870-4b0e-a484-4feefbf89762","Type":"ContainerDied","Data":"b5d1d7f7bfb6628f9acd613cf32ff580025898fcb2af8f6c574d8104bf8a556e"} Oct 01 12:53:24 crc kubenswrapper[4913]: I1001 12:53:24.363960 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5d1d7f7bfb6628f9acd613cf32ff580025898fcb2af8f6c574d8104bf8a556e" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.499019 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-72e2-account-create-xdl2c"] Oct 01 12:53:26 crc kubenswrapper[4913]: E1001 12:53:26.499932 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbc2f4d-f870-4b0e-a484-4feefbf89762" containerName="mariadb-account-create" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.499947 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbc2f4d-f870-4b0e-a484-4feefbf89762" containerName="mariadb-account-create" Oct 01 12:53:26 crc kubenswrapper[4913]: E1001 12:53:26.499960 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792a476e-74a0-4af7-be72-f496b501e22f" containerName="mariadb-account-create" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.499970 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a476e-74a0-4af7-be72-f496b501e22f" containerName="mariadb-account-create" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.500134 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbc2f4d-f870-4b0e-a484-4feefbf89762" containerName="mariadb-account-create" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.500152 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="792a476e-74a0-4af7-be72-f496b501e22f" containerName="mariadb-account-create" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 
12:53:26.500698 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.503603 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.509034 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-72e2-account-create-xdl2c"] Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.558758 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpv99\" (UniqueName: \"kubernetes.io/projected/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f-kube-api-access-kpv99\") pod \"glance-72e2-account-create-xdl2c\" (UID: \"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f\") " pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.660600 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpv99\" (UniqueName: \"kubernetes.io/projected/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f-kube-api-access-kpv99\") pod \"glance-72e2-account-create-xdl2c\" (UID: \"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f\") " pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.681247 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpv99\" (UniqueName: \"kubernetes.io/projected/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f-kube-api-access-kpv99\") pod \"glance-72e2-account-create-xdl2c\" (UID: \"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f\") " pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:26 crc kubenswrapper[4913]: I1001 12:53:26.821213 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:27 crc kubenswrapper[4913]: I1001 12:53:27.243706 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-72e2-account-create-xdl2c"] Oct 01 12:53:27 crc kubenswrapper[4913]: I1001 12:53:27.392922 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-72e2-account-create-xdl2c" event={"ID":"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f","Type":"ContainerStarted","Data":"265b62dcf9b1c66c12de285578d12a4fce1ff83e18de4f682d38cb628124a500"} Oct 01 12:53:27 crc kubenswrapper[4913]: I1001 12:53:27.832523 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.122875 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.126974 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-l4t7s"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.128440 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.144732 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l4t7s"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.186722 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qbk\" (UniqueName: \"kubernetes.io/projected/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d-kube-api-access-46qbk\") pod \"cinder-db-create-l4t7s\" (UID: \"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d\") " pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.227923 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h8szt"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.228966 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.235428 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h8szt"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.288887 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2b6\" (UniqueName: \"kubernetes.io/projected/031164fb-9e5d-42a4-aca9-45ce70c435d7-kube-api-access-4q2b6\") pod \"barbican-db-create-h8szt\" (UID: \"031164fb-9e5d-42a4-aca9-45ce70c435d7\") " pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.289126 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qbk\" (UniqueName: \"kubernetes.io/projected/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d-kube-api-access-46qbk\") pod \"cinder-db-create-l4t7s\" (UID: \"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d\") " pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.319604 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qbk\" (UniqueName: \"kubernetes.io/projected/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d-kube-api-access-46qbk\") pod \"cinder-db-create-l4t7s\" (UID: \"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d\") " pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.390503 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2b6\" (UniqueName: \"kubernetes.io/projected/031164fb-9e5d-42a4-aca9-45ce70c435d7-kube-api-access-4q2b6\") pod \"barbican-db-create-h8szt\" (UID: \"031164fb-9e5d-42a4-aca9-45ce70c435d7\") " pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.400021 4913 generic.go:334] "Generic (PLEG): container finished" podID="08b36fbf-6076-4ebf-8d71-6f3f121a9f5f" containerID="ea0ce763dd128e8cab6b4cf28b697295ede7e0ba785bf094156e1dc64b216484" exitCode=0 Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.400118 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-72e2-account-create-xdl2c" event={"ID":"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f","Type":"ContainerDied","Data":"ea0ce763dd128e8cab6b4cf28b697295ede7e0ba785bf094156e1dc64b216484"} Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.408324 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2b6\" (UniqueName: \"kubernetes.io/projected/031164fb-9e5d-42a4-aca9-45ce70c435d7-kube-api-access-4q2b6\") pod 
\"barbican-db-create-h8szt\" (UID: \"031164fb-9e5d-42a4-aca9-45ce70c435d7\") " pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.430940 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-482r5"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.431854 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-482r5" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.437540 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-482r5"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.450782 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.492804 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfcxj\" (UniqueName: \"kubernetes.io/projected/736d359f-862f-434b-855d-4e7152a297a3-kube-api-access-tfcxj\") pod \"neutron-db-create-482r5\" (UID: \"736d359f-862f-434b-855d-4e7152a297a3\") " pod="openstack/neutron-db-create-482r5" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.542973 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.595144 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfcxj\" (UniqueName: \"kubernetes.io/projected/736d359f-862f-434b-855d-4e7152a297a3-kube-api-access-tfcxj\") pod \"neutron-db-create-482r5\" (UID: \"736d359f-862f-434b-855d-4e7152a297a3\") " pod="openstack/neutron-db-create-482r5" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.622112 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfcxj\" (UniqueName: \"kubernetes.io/projected/736d359f-862f-434b-855d-4e7152a297a3-kube-api-access-tfcxj\") pod \"neutron-db-create-482r5\" (UID: \"736d359f-862f-434b-855d-4e7152a297a3\") " pod="openstack/neutron-db-create-482r5" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.626195 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qslcs"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.636145 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qslcs"] Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.636291 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.639710 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.640666 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lbn2j" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.640841 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.641427 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.698185 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-combined-ca-bundle\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.698233 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsbs2\" (UniqueName: \"kubernetes.io/projected/0d726dfa-1763-4ae9-999a-2b58c91ae988-kube-api-access-xsbs2\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.698316 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-config-data\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.767865 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-482r5" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.803113 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-config-data\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.803224 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-combined-ca-bundle\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.803271 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsbs2\" (UniqueName: \"kubernetes.io/projected/0d726dfa-1763-4ae9-999a-2b58c91ae988-kube-api-access-xsbs2\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.807903 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-combined-ca-bundle\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.823363 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsbs2\" (UniqueName: \"kubernetes.io/projected/0d726dfa-1763-4ae9-999a-2b58c91ae988-kube-api-access-xsbs2\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.824898 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-config-data\") pod \"keystone-db-sync-qslcs\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") " pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.923164 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l4t7s"] Oct 01 12:53:28 crc kubenswrapper[4913]: W1001 12:53:28.928301 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a55cd8b_17c4_480f_85f8_2dd8bd1d5e9d.slice/crio-db16821fdc449d5d58c5e4381b5de0903f482d25daa038b8e60eaff73ee18c98 WatchSource:0}: Error finding container db16821fdc449d5d58c5e4381b5de0903f482d25daa038b8e60eaff73ee18c98: Status 404 returned error can't find the container with id db16821fdc449d5d58c5e4381b5de0903f482d25daa038b8e60eaff73ee18c98 Oct 01 12:53:28 crc kubenswrapper[4913]: I1001 12:53:28.963404 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qslcs" Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.045457 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h8szt"] Oct 01 12:53:29 crc kubenswrapper[4913]: W1001 12:53:29.054894 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031164fb_9e5d_42a4_aca9_45ce70c435d7.slice/crio-5f585a9206001648d443f89a8fc46a42ce11e26451dee722607f2f71ea175b79 WatchSource:0}: Error finding container 5f585a9206001648d443f89a8fc46a42ce11e26451dee722607f2f71ea175b79: Status 404 returned error can't find the container with id 5f585a9206001648d443f89a8fc46a42ce11e26451dee722607f2f71ea175b79 Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.204089 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-482r5"] Oct 01 12:53:29 crc kubenswrapper[4913]: W1001 12:53:29.211176 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod736d359f_862f_434b_855d_4e7152a297a3.slice/crio-8b66009bf3d262bb4bd7db931a3f9c43b4c06403668a9054c9fe39f0c6b8e29d WatchSource:0}: Error finding container 8b66009bf3d262bb4bd7db931a3f9c43b4c06403668a9054c9fe39f0c6b8e29d: Status 404 returned error can't find the container with id 8b66009bf3d262bb4bd7db931a3f9c43b4c06403668a9054c9fe39f0c6b8e29d Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.375894 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qslcs"] Oct 01 12:53:29 crc kubenswrapper[4913]: W1001 12:53:29.387751 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d726dfa_1763_4ae9_999a_2b58c91ae988.slice/crio-08f42788114d959dc206b4ed4bfb00991520451af3fad5e5b9dafe444d8885c8 WatchSource:0}: Error finding container 08f42788114d959dc206b4ed4bfb00991520451af3fad5e5b9dafe444d8885c8: Status 404 returned error can't find the container with id 08f42788114d959dc206b4ed4bfb00991520451af3fad5e5b9dafe444d8885c8 Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.409692 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-482r5" event={"ID":"736d359f-862f-434b-855d-4e7152a297a3","Type":"ContainerStarted","Data":"8b66009bf3d262bb4bd7db931a3f9c43b4c06403668a9054c9fe39f0c6b8e29d"} Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.410887 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qslcs" event={"ID":"0d726dfa-1763-4ae9-999a-2b58c91ae988","Type":"ContainerStarted","Data":"08f42788114d959dc206b4ed4bfb00991520451af3fad5e5b9dafe444d8885c8"} Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.412840 4913 generic.go:334] "Generic (PLEG): container finished" podID="031164fb-9e5d-42a4-aca9-45ce70c435d7" containerID="fa1367da838eb1d9d7ad7f0f3c984cd02576420e126bf863a1071c8477ad0550" exitCode=0 Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.412896 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h8szt" event={"ID":"031164fb-9e5d-42a4-aca9-45ce70c435d7","Type":"ContainerDied","Data":"fa1367da838eb1d9d7ad7f0f3c984cd02576420e126bf863a1071c8477ad0550"} Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.412915 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h8szt" 
event={"ID":"031164fb-9e5d-42a4-aca9-45ce70c435d7","Type":"ContainerStarted","Data":"5f585a9206001648d443f89a8fc46a42ce11e26451dee722607f2f71ea175b79"} Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.415022 4913 generic.go:334] "Generic (PLEG): container finished" podID="0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d" containerID="7772479e635746bd28e22825db2711638f4f5e1dde8a95d8437381ed62339858" exitCode=0 Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.415189 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l4t7s" event={"ID":"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d","Type":"ContainerDied","Data":"7772479e635746bd28e22825db2711638f4f5e1dde8a95d8437381ed62339858"} Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.415216 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l4t7s" event={"ID":"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d","Type":"ContainerStarted","Data":"db16821fdc449d5d58c5e4381b5de0903f482d25daa038b8e60eaff73ee18c98"} Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.792447 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.921507 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpv99\" (UniqueName: \"kubernetes.io/projected/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f-kube-api-access-kpv99\") pod \"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f\" (UID: \"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f\") " Oct 01 12:53:29 crc kubenswrapper[4913]: I1001 12:53:29.926391 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f-kube-api-access-kpv99" (OuterVolumeSpecName: "kube-api-access-kpv99") pod "08b36fbf-6076-4ebf-8d71-6f3f121a9f5f" (UID: "08b36fbf-6076-4ebf-8d71-6f3f121a9f5f"). InnerVolumeSpecName "kube-api-access-kpv99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.024211 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpv99\" (UniqueName: \"kubernetes.io/projected/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f-kube-api-access-kpv99\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.428678 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-72e2-account-create-xdl2c" event={"ID":"08b36fbf-6076-4ebf-8d71-6f3f121a9f5f","Type":"ContainerDied","Data":"265b62dcf9b1c66c12de285578d12a4fce1ff83e18de4f682d38cb628124a500"} Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.428979 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265b62dcf9b1c66c12de285578d12a4fce1ff83e18de4f682d38cb628124a500" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.428722 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-72e2-account-create-xdl2c" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.430955 4913 generic.go:334] "Generic (PLEG): container finished" podID="736d359f-862f-434b-855d-4e7152a297a3" containerID="c587b6ca356b4a0324937018127b81faa7d65a8fc75955af7a271868a5467ebd" exitCode=0 Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.431073 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-482r5" event={"ID":"736d359f-862f-434b-855d-4e7152a297a3","Type":"ContainerDied","Data":"c587b6ca356b4a0324937018127b81faa7d65a8fc75955af7a271868a5467ebd"} Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.866639 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.873245 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.938193 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qbk\" (UniqueName: \"kubernetes.io/projected/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d-kube-api-access-46qbk\") pod \"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d\" (UID: \"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d\") " Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.938282 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q2b6\" (UniqueName: \"kubernetes.io/projected/031164fb-9e5d-42a4-aca9-45ce70c435d7-kube-api-access-4q2b6\") pod \"031164fb-9e5d-42a4-aca9-45ce70c435d7\" (UID: \"031164fb-9e5d-42a4-aca9-45ce70c435d7\") " Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.942612 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d-kube-api-access-46qbk" (OuterVolumeSpecName: "kube-api-access-46qbk") pod "0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d" (UID: "0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d"). InnerVolumeSpecName "kube-api-access-46qbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:30 crc kubenswrapper[4913]: I1001 12:53:30.943419 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031164fb-9e5d-42a4-aca9-45ce70c435d7-kube-api-access-4q2b6" (OuterVolumeSpecName: "kube-api-access-4q2b6") pod "031164fb-9e5d-42a4-aca9-45ce70c435d7" (UID: "031164fb-9e5d-42a4-aca9-45ce70c435d7"). InnerVolumeSpecName "kube-api-access-4q2b6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.042823 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qbk\" (UniqueName: \"kubernetes.io/projected/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d-kube-api-access-46qbk\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.043831 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q2b6\" (UniqueName: \"kubernetes.io/projected/031164fb-9e5d-42a4-aca9-45ce70c435d7-kube-api-access-4q2b6\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.438436 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h8szt" event={"ID":"031164fb-9e5d-42a4-aca9-45ce70c435d7","Type":"ContainerDied","Data":"5f585a9206001648d443f89a8fc46a42ce11e26451dee722607f2f71ea175b79"} Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.438473 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f585a9206001648d443f89a8fc46a42ce11e26451dee722607f2f71ea175b79" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.438516 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h8szt" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.445789 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l4t7s" event={"ID":"0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d","Type":"ContainerDied","Data":"db16821fdc449d5d58c5e4381b5de0903f482d25daa038b8e60eaff73ee18c98"} Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.445821 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l4t7s" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.445830 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db16821fdc449d5d58c5e4381b5de0903f482d25daa038b8e60eaff73ee18c98" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.621567 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b6l67"] Oct 01 12:53:31 crc kubenswrapper[4913]: E1001 12:53:31.621880 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d" containerName="mariadb-database-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.621896 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d" containerName="mariadb-database-create" Oct 01 12:53:31 crc kubenswrapper[4913]: E1001 12:53:31.621919 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b36fbf-6076-4ebf-8d71-6f3f121a9f5f" containerName="mariadb-account-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.621926 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b36fbf-6076-4ebf-8d71-6f3f121a9f5f" containerName="mariadb-account-create" Oct 01 12:53:31 crc kubenswrapper[4913]: E1001 12:53:31.621943 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031164fb-9e5d-42a4-aca9-45ce70c435d7" containerName="mariadb-database-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.621949 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="031164fb-9e5d-42a4-aca9-45ce70c435d7" containerName="mariadb-database-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.622105 4913 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="08b36fbf-6076-4ebf-8d71-6f3f121a9f5f" containerName="mariadb-account-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.622119 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d" containerName="mariadb-database-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.622140 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="031164fb-9e5d-42a4-aca9-45ce70c435d7" containerName="mariadb-database-create" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.622667 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.626999 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tjvjz" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.627235 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.634507 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b6l67"] Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.758323 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-db-sync-config-data\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.758417 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-config-data\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.758490 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qlb\" (UniqueName: \"kubernetes.io/projected/57296118-560c-4764-b94a-472d8467f7c0-kube-api-access-s2qlb\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.758796 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-combined-ca-bundle\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.868204 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-combined-ca-bundle\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.868299 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-db-sync-config-data\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc 
kubenswrapper[4913]: I1001 12:53:31.868356 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-config-data\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.868408 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qlb\" (UniqueName: \"kubernetes.io/projected/57296118-560c-4764-b94a-472d8467f7c0-kube-api-access-s2qlb\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.881210 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-combined-ca-bundle\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.884299 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-config-data\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.889533 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qlb\" (UniqueName: \"kubernetes.io/projected/57296118-560c-4764-b94a-472d8467f7c0-kube-api-access-s2qlb\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.889867 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-db-sync-config-data\") pod \"glance-db-sync-b6l67\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:31 crc kubenswrapper[4913]: I1001 12:53:31.947421 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b6l67" Oct 01 12:53:34 crc kubenswrapper[4913]: I1001 12:53:34.485666 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-482r5" event={"ID":"736d359f-862f-434b-855d-4e7152a297a3","Type":"ContainerDied","Data":"8b66009bf3d262bb4bd7db931a3f9c43b4c06403668a9054c9fe39f0c6b8e29d"} Oct 01 12:53:34 crc kubenswrapper[4913]: I1001 12:53:34.485927 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b66009bf3d262bb4bd7db931a3f9c43b4c06403668a9054c9fe39f0c6b8e29d" Oct 01 12:53:34 crc kubenswrapper[4913]: I1001 12:53:34.658631 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-482r5" Oct 01 12:53:34 crc kubenswrapper[4913]: I1001 12:53:34.716665 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfcxj\" (UniqueName: \"kubernetes.io/projected/736d359f-862f-434b-855d-4e7152a297a3-kube-api-access-tfcxj\") pod \"736d359f-862f-434b-855d-4e7152a297a3\" (UID: \"736d359f-862f-434b-855d-4e7152a297a3\") " Oct 01 12:53:34 crc kubenswrapper[4913]: I1001 12:53:34.722734 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736d359f-862f-434b-855d-4e7152a297a3-kube-api-access-tfcxj" (OuterVolumeSpecName: "kube-api-access-tfcxj") pod "736d359f-862f-434b-855d-4e7152a297a3" (UID: "736d359f-862f-434b-855d-4e7152a297a3"). InnerVolumeSpecName "kube-api-access-tfcxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:34 crc kubenswrapper[4913]: I1001 12:53:34.818308 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfcxj\" (UniqueName: \"kubernetes.io/projected/736d359f-862f-434b-855d-4e7152a297a3-kube-api-access-tfcxj\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:35 crc kubenswrapper[4913]: I1001 12:53:35.133317 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b6l67"] Oct 01 12:53:35 crc kubenswrapper[4913]: I1001 12:53:35.494743 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qslcs" event={"ID":"0d726dfa-1763-4ae9-999a-2b58c91ae988","Type":"ContainerStarted","Data":"afc7ec16485816182a90a00e377ec239fb4a1a1c7c8e35bdc014cc43119ce7a0"} Oct 01 12:53:35 crc kubenswrapper[4913]: I1001 12:53:35.495548 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-482r5" Oct 01 12:53:35 crc kubenswrapper[4913]: I1001 12:53:35.495549 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b6l67" event={"ID":"57296118-560c-4764-b94a-472d8467f7c0","Type":"ContainerStarted","Data":"4635b0c2957a72496e904a97f09d3fdb273ebf452f6c4a8f75efac241057deea"} Oct 01 12:53:35 crc kubenswrapper[4913]: I1001 12:53:35.513000 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qslcs" podStartSLOduration=2.326821175 podStartE2EDuration="7.512985391s" podCreationTimestamp="2025-10-01 12:53:28 +0000 UTC" firstStartedPulling="2025-10-01 12:53:29.390558137 +0000 UTC m=+941.294033715" lastFinishedPulling="2025-10-01 12:53:34.576722353 +0000 UTC m=+946.480197931" observedRunningTime="2025-10-01 12:53:35.509651919 +0000 UTC m=+947.413127517" watchObservedRunningTime="2025-10-01 12:53:35.512985391 +0000 UTC m=+947.416460969" Oct 01 12:53:36 crc kubenswrapper[4913]: I1001 12:53:36.540175 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 01 12:53:37 crc kubenswrapper[4913]: I1001 12:53:37.510912 4913 generic.go:334] "Generic (PLEG): container finished" podID="0d726dfa-1763-4ae9-999a-2b58c91ae988" containerID="afc7ec16485816182a90a00e377ec239fb4a1a1c7c8e35bdc014cc43119ce7a0" exitCode=0 Oct 01 12:53:37 crc kubenswrapper[4913]: I1001 12:53:37.510961 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qslcs" event={"ID":"0d726dfa-1763-4ae9-999a-2b58c91ae988","Type":"ContainerDied","Data":"afc7ec16485816182a90a00e377ec239fb4a1a1c7c8e35bdc014cc43119ce7a0"} Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.276869 4913 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5ae0-account-create-rvj2c"]
Oct 01 12:53:38 crc kubenswrapper[4913]: E1001 12:53:38.277470 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736d359f-862f-434b-855d-4e7152a297a3" containerName="mariadb-database-create"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.277482 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="736d359f-862f-434b-855d-4e7152a297a3" containerName="mariadb-database-create"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.277626 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="736d359f-862f-434b-855d-4e7152a297a3" containerName="mariadb-database-create"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.278114 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5ae0-account-create-rvj2c"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.279756 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.288151 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5ae0-account-create-rvj2c"]
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.374306 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2409-account-create-8xltx"]
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.375648 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2409-account-create-8xltx"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.377702 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.390644 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cx4c\" (UniqueName: \"kubernetes.io/projected/f222cfe0-8537-43b8-a6be-b18bd7bbcaff-kube-api-access-2cx4c\") pod \"cinder-5ae0-account-create-rvj2c\" (UID: \"f222cfe0-8537-43b8-a6be-b18bd7bbcaff\") " pod="openstack/cinder-5ae0-account-create-rvj2c"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.392457 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2409-account-create-8xltx"]
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.492038 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k7rb\" (UniqueName: \"kubernetes.io/projected/a31c09d0-abac-44cc-9d6d-94c05a99e577-kube-api-access-4k7rb\") pod \"barbican-2409-account-create-8xltx\" (UID: \"a31c09d0-abac-44cc-9d6d-94c05a99e577\") " pod="openstack/barbican-2409-account-create-8xltx"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.492133 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cx4c\" (UniqueName: \"kubernetes.io/projected/f222cfe0-8537-43b8-a6be-b18bd7bbcaff-kube-api-access-2cx4c\") pod \"cinder-5ae0-account-create-rvj2c\" (UID: \"f222cfe0-8537-43b8-a6be-b18bd7bbcaff\") " pod="openstack/cinder-5ae0-account-create-rvj2c"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.535252 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cx4c\" (UniqueName: \"kubernetes.io/projected/f222cfe0-8537-43b8-a6be-b18bd7bbcaff-kube-api-access-2cx4c\") pod \"cinder-5ae0-account-create-rvj2c\" (UID: \"f222cfe0-8537-43b8-a6be-b18bd7bbcaff\") " pod="openstack/cinder-5ae0-account-create-rvj2c"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.593697 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k7rb\" (UniqueName: \"kubernetes.io/projected/a31c09d0-abac-44cc-9d6d-94c05a99e577-kube-api-access-4k7rb\") pod \"barbican-2409-account-create-8xltx\" (UID: \"a31c09d0-abac-44cc-9d6d-94c05a99e577\") " pod="openstack/barbican-2409-account-create-8xltx"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.609980 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k7rb\" (UniqueName: \"kubernetes.io/projected/a31c09d0-abac-44cc-9d6d-94c05a99e577-kube-api-access-4k7rb\") pod \"barbican-2409-account-create-8xltx\" (UID: \"a31c09d0-abac-44cc-9d6d-94c05a99e577\") " pod="openstack/barbican-2409-account-create-8xltx"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.648411 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5ae0-account-create-rvj2c"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.695182 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2409-account-create-8xltx"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.802306 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qslcs"
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.901058 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-combined-ca-bundle\") pod \"0d726dfa-1763-4ae9-999a-2b58c91ae988\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") "
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.901315 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-config-data\") pod \"0d726dfa-1763-4ae9-999a-2b58c91ae988\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") "
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.901370 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsbs2\" (UniqueName: \"kubernetes.io/projected/0d726dfa-1763-4ae9-999a-2b58c91ae988-kube-api-access-xsbs2\") pod \"0d726dfa-1763-4ae9-999a-2b58c91ae988\" (UID: \"0d726dfa-1763-4ae9-999a-2b58c91ae988\") "
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.906387 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d726dfa-1763-4ae9-999a-2b58c91ae988-kube-api-access-xsbs2" (OuterVolumeSpecName: "kube-api-access-xsbs2") pod "0d726dfa-1763-4ae9-999a-2b58c91ae988" (UID: "0d726dfa-1763-4ae9-999a-2b58c91ae988"). InnerVolumeSpecName "kube-api-access-xsbs2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.928057 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d726dfa-1763-4ae9-999a-2b58c91ae988" (UID: "0d726dfa-1763-4ae9-999a-2b58c91ae988"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:38 crc kubenswrapper[4913]: I1001 12:53:38.951423 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-config-data" (OuterVolumeSpecName: "config-data") pod "0d726dfa-1763-4ae9-999a-2b58c91ae988" (UID: "0d726dfa-1763-4ae9-999a-2b58c91ae988"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.003554 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.003595 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d726dfa-1763-4ae9-999a-2b58c91ae988-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.003608 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsbs2\" (UniqueName: \"kubernetes.io/projected/0d726dfa-1763-4ae9-999a-2b58c91ae988-kube-api-access-xsbs2\") on node \"crc\" DevicePath \"\""
Oct 01 12:53:39 crc kubenswrapper[4913]: W1001 12:53:39.155845 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf222cfe0_8537_43b8_a6be_b18bd7bbcaff.slice/crio-e38f2e690025d06ad7e44526210c8acef17238b702181c3a8bdc08040b3d67a6 WatchSource:0}: Error finding container e38f2e690025d06ad7e44526210c8acef17238b702181c3a8bdc08040b3d67a6: Status 404 returned error can't find the container with id e38f2e690025d06ad7e44526210c8acef17238b702181c3a8bdc08040b3d67a6
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.160526 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5ae0-account-create-rvj2c"]
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.206887 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2409-account-create-8xltx"]
Oct 01 12:53:39 crc kubenswrapper[4913]: W1001 12:53:39.207119 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda31c09d0_abac_44cc_9d6d_94c05a99e577.slice/crio-4ee8ee853ccaed3429cfeaa2c689395e0e2d289c3d46fa282210b75af2aa7344 WatchSource:0}: Error finding container 4ee8ee853ccaed3429cfeaa2c689395e0e2d289c3d46fa282210b75af2aa7344: Status 404 returned error can't find the container with id 4ee8ee853ccaed3429cfeaa2c689395e0e2d289c3d46fa282210b75af2aa7344
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.557791 4913 generic.go:334] "Generic (PLEG): container finished" podID="a31c09d0-abac-44cc-9d6d-94c05a99e577" containerID="9a21abe3da8a5b9eeca15c45c28c09716e7d2662ab769ebdfb3c32ea7e246b3a" exitCode=0
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.558106 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2409-account-create-8xltx" event={"ID":"a31c09d0-abac-44cc-9d6d-94c05a99e577","Type":"ContainerDied","Data":"9a21abe3da8a5b9eeca15c45c28c09716e7d2662ab769ebdfb3c32ea7e246b3a"}
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.558133 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2409-account-create-8xltx" event={"ID":"a31c09d0-abac-44cc-9d6d-94c05a99e577","Type":"ContainerStarted","Data":"4ee8ee853ccaed3429cfeaa2c689395e0e2d289c3d46fa282210b75af2aa7344"}
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.569668 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qslcs"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.569734 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qslcs" event={"ID":"0d726dfa-1763-4ae9-999a-2b58c91ae988","Type":"ContainerDied","Data":"08f42788114d959dc206b4ed4bfb00991520451af3fad5e5b9dafe444d8885c8"}
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.569776 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08f42788114d959dc206b4ed4bfb00991520451af3fad5e5b9dafe444d8885c8"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.578170 4913 generic.go:334] "Generic (PLEG): container finished" podID="f222cfe0-8537-43b8-a6be-b18bd7bbcaff" containerID="618aad5f2cb8064047d1555435cf75f5b94e2d99a72c38de318bd12ea5565563" exitCode=0
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.578215 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5ae0-account-create-rvj2c" event={"ID":"f222cfe0-8537-43b8-a6be-b18bd7bbcaff","Type":"ContainerDied","Data":"618aad5f2cb8064047d1555435cf75f5b94e2d99a72c38de318bd12ea5565563"}
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.578241 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5ae0-account-create-rvj2c" event={"ID":"f222cfe0-8537-43b8-a6be-b18bd7bbcaff","Type":"ContainerStarted","Data":"e38f2e690025d06ad7e44526210c8acef17238b702181c3a8bdc08040b3d67a6"}
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.803251 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bcx4f"]
Oct 01 12:53:39 crc kubenswrapper[4913]: E1001 12:53:39.805513 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d726dfa-1763-4ae9-999a-2b58c91ae988" containerName="keystone-db-sync"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.805545 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d726dfa-1763-4ae9-999a-2b58c91ae988" containerName="keystone-db-sync"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.807345 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d726dfa-1763-4ae9-999a-2b58c91ae988" containerName="keystone-db-sync"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.807985 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.812450 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744bbc95cf-4zm5x"]
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.814109 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.814604 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.814925 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.815061 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.816652 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lbn2j"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.834773 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bcx4f"]
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.866370 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744bbc95cf-4zm5x"]
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921198 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-combined-ca-bundle\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921248 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2l54\" (UniqueName: \"kubernetes.io/projected/dcb997eb-469c-4436-9ac8-e814b6488263-kube-api-access-q2l54\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921306 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-dns-svc\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921339 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-credential-keys\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921436 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-config-data\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921464 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-nb\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921490 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxhk\" (UniqueName: \"kubernetes.io/projected/fee1115e-9951-48e7-89c0-b5c676716145-kube-api-access-rsxhk\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921528 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-sb\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921568 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-fernet-keys\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921591 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-config\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.921615 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-scripts\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.960808 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f465bcb97-7bzq4"]
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.962183 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.966891 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.967051 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.967200 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-kcg9w"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.967739 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 01 12:53:39 crc kubenswrapper[4913]: I1001 12:53:39.988480 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f465bcb97-7bzq4"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023515 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-dns-svc\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023812 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-credential-keys\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023853 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a601b3-1f16-4504-bc7c-aa573c34764e-logs\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023877 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-config-data\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023900 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-nb\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023921 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxhk\" (UniqueName: \"kubernetes.io/projected/fee1115e-9951-48e7-89c0-b5c676716145-kube-api-access-rsxhk\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023948 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-sb\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023968 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-fernet-keys\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.023984 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-config\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024002 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-scripts\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024032 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmtl\" (UniqueName: \"kubernetes.io/projected/50a601b3-1f16-4504-bc7c-aa573c34764e-kube-api-access-ngmtl\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024053 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/50a601b3-1f16-4504-bc7c-aa573c34764e-horizon-secret-key\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024087 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-scripts\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024112 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-combined-ca-bundle\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024127 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-config-data\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024144 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2l54\" (UniqueName: \"kubernetes.io/projected/dcb997eb-469c-4436-9ac8-e814b6488263-kube-api-access-q2l54\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024359 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-dns-svc\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.024903 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-config\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.025615 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-nb\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.026216 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-sb\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.039082 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-credential-keys\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.041547 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-fernet-keys\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.044935 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-combined-ca-bundle\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.051352 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-scripts\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.057811 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-config-data\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.058062 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.061640 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.072809 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.073560 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.074961 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.076228 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxhk\" (UniqueName: \"kubernetes.io/projected/fee1115e-9951-48e7-89c0-b5c676716145-kube-api-access-rsxhk\") pod \"dnsmasq-dns-744bbc95cf-4zm5x\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.083949 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.084009 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.084054 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.084104 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2l54\" (UniqueName: \"kubernetes.io/projected/dcb997eb-469c-4436-9ac8-e814b6488263-kube-api-access-q2l54\") pod \"keystone-bootstrap-bcx4f\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.084769 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"799ca569c504b87f0203003c8051d299d1a44d32ea3031c0c1940d1be3fbaa96"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.084840 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://799ca569c504b87f0203003c8051d299d1a44d32ea3031c0c1940d1be3fbaa96" gracePeriod=600
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126589 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmtl\" (UniqueName: \"kubernetes.io/projected/50a601b3-1f16-4504-bc7c-aa573c34764e-kube-api-access-ngmtl\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126633 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gmq\" (UniqueName: \"kubernetes.io/projected/e43a3fef-6dd6-4239-b6de-028dfa7145fc-kube-api-access-q6gmq\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126658 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/50a601b3-1f16-4504-bc7c-aa573c34764e-horizon-secret-key\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126692 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-scripts\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126713 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-run-httpd\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126735 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-config-data\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126775 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-scripts\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126791 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-config-data\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126807 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126824 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a601b3-1f16-4504-bc7c-aa573c34764e-logs\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126841 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.126864 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-log-httpd\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.127531 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a601b3-1f16-4504-bc7c-aa573c34764e-logs\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.128703 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-config-data\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.129013 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-scripts\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.135308 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/50a601b3-1f16-4504-bc7c-aa573c34764e-horizon-secret-key\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.135654 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bcx4f"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.145127 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744bbc95cf-4zm5x"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.145941 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.161826 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8r5fv"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.162909 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.166398 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmtl\" (UniqueName: \"kubernetes.io/projected/50a601b3-1f16-4504-bc7c-aa573c34764e-kube-api-access-ngmtl\") pod \"horizon-5f465bcb97-7bzq4\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.167548 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tfwq6"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.167759 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.167864 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.187633 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7df6769d97-bz6l5"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.189058 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.207897 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8r5fv"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.227778 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228063 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228154 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-config-data\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228256 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-log-httpd\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228365 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-scripts\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228503 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6gmq\" (UniqueName: \"kubernetes.io/projected/e43a3fef-6dd6-4239-b6de-028dfa7145fc-kube-api-access-q6gmq\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228613 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-run-httpd\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228691 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3950662-6b64-4585-8cb2-8c94623a3d66-logs\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228771 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzj6\" (UniqueName: \"kubernetes.io/projected/b3950662-6b64-4585-8cb2-8c94623a3d66-kube-api-access-pjzj6\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228894 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-combined-ca-bundle\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.228983 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-scripts\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.229061 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-config-data\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.231248 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-run-httpd\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.231608 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-log-httpd\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.231669 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.237766 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.238715 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-config-data\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.243193 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-scripts\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.246518 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77bdb78dfc-7qtdk"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.250319 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.251600 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6gmq\" (UniqueName: \"kubernetes.io/projected/e43a3fef-6dd6-4239-b6de-028dfa7145fc-kube-api-access-q6gmq\") pod \"ceilometer-0\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.258285 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7df6769d97-bz6l5"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.265491 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77bdb78dfc-7qtdk"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.301883 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f465bcb97-7bzq4"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.332220 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-config-data\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.332481 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-scripts\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.332571 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80c570d0-d665-4680-a6e5-b4c7734a87af-horizon-secret-key\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.332600 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-scripts\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.332622 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-config-data\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.332647 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-nb\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.334374 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-config\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.335999 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c570d0-d665-4680-a6e5-b4c7734a87af-logs\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336024 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3950662-6b64-4585-8cb2-8c94623a3d66-logs\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336053 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzj6\" (UniqueName: \"kubernetes.io/projected/b3950662-6b64-4585-8cb2-8c94623a3d66-kube-api-access-pjzj6\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336117 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-sb\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336156 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6f9\" (UniqueName: \"kubernetes.io/projected/80c570d0-d665-4680-a6e5-b4c7734a87af-kube-api-access-8p6f9\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336201 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrvzh\" (UniqueName: \"kubernetes.io/projected/60087e2e-0b0a-4a75-97be-912e06c0b17a-kube-api-access-mrvzh\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336232 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-dns-svc\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.336256 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-combined-ca-bundle\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.340630 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-scripts\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.341823 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3950662-6b64-4585-8cb2-8c94623a3d66-logs\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.343074 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-combined-ca-bundle\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.347567 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-config-data\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.359811 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzj6\" (UniqueName: \"kubernetes.io/projected/b3950662-6b64-4585-8cb2-8c94623a3d66-kube-api-access-pjzj6\") pod \"placement-db-sync-8r5fv\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437729 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-config\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437810 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c570d0-d665-4680-a6e5-b4c7734a87af-logs\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437838 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-sb\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437860 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6f9\" (UniqueName: \"kubernetes.io/projected/80c570d0-d665-4680-a6e5-b4c7734a87af-kube-api-access-8p6f9\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437883 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrvzh\" (UniqueName: \"kubernetes.io/projected/60087e2e-0b0a-4a75-97be-912e06c0b17a-kube-api-access-mrvzh\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437900 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-dns-svc\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437931 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-scripts\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437953 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80c570d0-d665-4680-a6e5-b4c7734a87af-horizon-secret-key\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437973 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-config-data\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.437992 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-nb\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.438580 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-config\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.438739 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-nb\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.439260 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-dns-svc\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.439342 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-sb\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.439769 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-scripts\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.440389 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c570d0-d665-4680-a6e5-b4c7734a87af-logs\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.440677 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-config-data\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.444975 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80c570d0-d665-4680-a6e5-b4c7734a87af-horizon-secret-key\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.454038 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrvzh\" (UniqueName: \"kubernetes.io/projected/60087e2e-0b0a-4a75-97be-912e06c0b17a-kube-api-access-mrvzh\") pod \"dnsmasq-dns-77bdb78dfc-7qtdk\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.466414 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6f9\" (UniqueName: \"kubernetes.io/projected/80c570d0-d665-4680-a6e5-b4c7734a87af-kube-api-access-8p6f9\") pod \"horizon-7df6769d97-bz6l5\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.534876 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.551588 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8r5fv"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.591210 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7df6769d97-bz6l5"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.618612 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.648133 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="799ca569c504b87f0203003c8051d299d1a44d32ea3031c0c1940d1be3fbaa96" exitCode=0
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.648202 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"799ca569c504b87f0203003c8051d299d1a44d32ea3031c0c1940d1be3fbaa96"}
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.648279 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"b8ccdeaa9feae2c057a74d4a7cb5ae3ea008156e58af6b9bb65a4673a0aaa9d4"}
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.648301 4913 scope.go:117] "RemoveContainer" containerID="770bb111d4d76e645ce0db85174ab71c7357ffc9ba302bee6b549ccfcb148bea"
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.693188 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bcx4f"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.857008 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744bbc95cf-4zm5x"]
Oct 01 12:53:40 crc kubenswrapper[4913]: I1001 12:53:40.871000 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f465bcb97-7bzq4"]
Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.161636 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5ae0-account-create-rvj2c"
Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.216072 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2409-account-create-8xltx"
Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.253244 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cx4c\" (UniqueName: \"kubernetes.io/projected/f222cfe0-8537-43b8-a6be-b18bd7bbcaff-kube-api-access-2cx4c\") pod \"f222cfe0-8537-43b8-a6be-b18bd7bbcaff\" (UID: \"f222cfe0-8537-43b8-a6be-b18bd7bbcaff\") "
Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.253431 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k7rb\" (UniqueName: \"kubernetes.io/projected/a31c09d0-abac-44cc-9d6d-94c05a99e577-kube-api-access-4k7rb\") pod \"a31c09d0-abac-44cc-9d6d-94c05a99e577\" (UID: \"a31c09d0-abac-44cc-9d6d-94c05a99e577\") "
Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.257982 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31c09d0-abac-44cc-9d6d-94c05a99e577-kube-api-access-4k7rb" (OuterVolumeSpecName: "kube-api-access-4k7rb") pod "a31c09d0-abac-44cc-9d6d-94c05a99e577" (UID: "a31c09d0-abac-44cc-9d6d-94c05a99e577"). InnerVolumeSpecName "kube-api-access-4k7rb".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.258446 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f222cfe0-8537-43b8-a6be-b18bd7bbcaff-kube-api-access-2cx4c" (OuterVolumeSpecName: "kube-api-access-2cx4c") pod "f222cfe0-8537-43b8-a6be-b18bd7bbcaff" (UID: "f222cfe0-8537-43b8-a6be-b18bd7bbcaff"). InnerVolumeSpecName "kube-api-access-2cx4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.273989 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:41 crc kubenswrapper[4913]: W1001 12:53:41.283533 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode43a3fef_6dd6_4239_b6de_028dfa7145fc.slice/crio-27f4c22dfc763acad225aad9466c734126458adcf4287ab84e44ff70d73ab501 WatchSource:0}: Error finding container 27f4c22dfc763acad225aad9466c734126458adcf4287ab84e44ff70d73ab501: Status 404 returned error can't find the container with id 27f4c22dfc763acad225aad9466c734126458adcf4287ab84e44ff70d73ab501 Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.356236 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cx4c\" (UniqueName: \"kubernetes.io/projected/f222cfe0-8537-43b8-a6be-b18bd7bbcaff-kube-api-access-2cx4c\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.356265 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k7rb\" (UniqueName: \"kubernetes.io/projected/a31c09d0-abac-44cc-9d6d-94c05a99e577-kube-api-access-4k7rb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:41 crc kubenswrapper[4913]: E1001 12:53:41.383326 4913 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee1115e_9951_48e7_89c0_b5c676716145.slice/crio-c0c02d8eca4c42c7b3eabc674bcdad60a870fa1093e326cf2ecab4f5c7a50d6a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee1115e_9951_48e7_89c0_b5c676716145.slice/crio-conmon-c0c02d8eca4c42c7b3eabc674bcdad60a870fa1093e326cf2ecab4f5c7a50d6a.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.456006 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8r5fv"] Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.474314 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77bdb78dfc-7qtdk"] Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.585542 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7df6769d97-bz6l5"] Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.659485 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e43a3fef-6dd6-4239-b6de-028dfa7145fc","Type":"ContainerStarted","Data":"27f4c22dfc763acad225aad9466c734126458adcf4287ab84e44ff70d73ab501"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.661220 4913 generic.go:334] "Generic (PLEG): container finished" podID="fee1115e-9951-48e7-89c0-b5c676716145" containerID="c0c02d8eca4c42c7b3eabc674bcdad60a870fa1093e326cf2ecab4f5c7a50d6a" exitCode=0 Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.661360 4913 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x" event={"ID":"fee1115e-9951-48e7-89c0-b5c676716145","Type":"ContainerDied","Data":"c0c02d8eca4c42c7b3eabc674bcdad60a870fa1093e326cf2ecab4f5c7a50d6a"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.661739 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x" event={"ID":"fee1115e-9951-48e7-89c0-b5c676716145","Type":"ContainerStarted","Data":"eeb869785d48b260d6fd91c4ce1b9b934c4ee23e70003612f05345d291537c2f"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.663902 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5ae0-account-create-rvj2c" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.663892 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5ae0-account-create-rvj2c" event={"ID":"f222cfe0-8537-43b8-a6be-b18bd7bbcaff","Type":"ContainerDied","Data":"e38f2e690025d06ad7e44526210c8acef17238b702181c3a8bdc08040b3d67a6"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.664405 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38f2e690025d06ad7e44526210c8acef17238b702181c3a8bdc08040b3d67a6" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.665095 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f465bcb97-7bzq4" event={"ID":"50a601b3-1f16-4504-bc7c-aa573c34764e","Type":"ContainerStarted","Data":"9d3a8d277c00bec18bb47349d8fdb2fcd336a437998d17676844375df249d9d6"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.666358 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcx4f" event={"ID":"dcb997eb-469c-4436-9ac8-e814b6488263","Type":"ContainerStarted","Data":"cd8de2d4fdb47138d9c8b7c9ec4d871b0d65877ad8eef06ab76e7383fcc813db"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.666385 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcx4f" event={"ID":"dcb997eb-469c-4436-9ac8-e814b6488263","Type":"ContainerStarted","Data":"96d9cfd048b364f67dd44ef4043e1c04e85c512b405c301003269468d1d54370"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.667702 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2409-account-create-8xltx" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.667703 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2409-account-create-8xltx" event={"ID":"a31c09d0-abac-44cc-9d6d-94c05a99e577","Type":"ContainerDied","Data":"4ee8ee853ccaed3429cfeaa2c689395e0e2d289c3d46fa282210b75af2aa7344"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.667808 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee8ee853ccaed3429cfeaa2c689395e0e2d289c3d46fa282210b75af2aa7344" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.671156 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df6769d97-bz6l5" event={"ID":"80c570d0-d665-4680-a6e5-b4c7734a87af","Type":"ContainerStarted","Data":"86e4321f2011037ff95bf4661a73063795966fcb84490fac33e04314daf4e951"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.672021 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" event={"ID":"60087e2e-0b0a-4a75-97be-912e06c0b17a","Type":"ContainerStarted","Data":"e1a8d0833f2f0740453843a3319514f7ad8a00bec93652aaca490aec89ffe5d9"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.672857 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r5fv" event={"ID":"b3950662-6b64-4585-8cb2-8c94623a3d66","Type":"ContainerStarted","Data":"885970f33c0db7405af2b604c9248730a9f669c7c4ac2077665cb1ac7ba071f5"} Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.704812 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bcx4f" podStartSLOduration=2.7047949940000002 podStartE2EDuration="2.704794994s" podCreationTimestamp="2025-10-01 12:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:41.699090948 +0000 UTC m=+953.602566546" watchObservedRunningTime="2025-10-01 12:53:41.704794994 +0000 UTC m=+953.608270572" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.951341 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f465bcb97-7bzq4"] Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.985704 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c4466fb5f-6vzh5"] Oct 01 12:53:41 crc kubenswrapper[4913]: E1001 12:53:41.986048 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f222cfe0-8537-43b8-a6be-b18bd7bbcaff" containerName="mariadb-account-create" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.986066 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f222cfe0-8537-43b8-a6be-b18bd7bbcaff" containerName="mariadb-account-create" Oct 01 12:53:41 crc kubenswrapper[4913]: E1001 12:53:41.986081 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a31c09d0-abac-44cc-9d6d-94c05a99e577" containerName="mariadb-account-create" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.986087 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a31c09d0-abac-44cc-9d6d-94c05a99e577" containerName="mariadb-account-create" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.986256 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f222cfe0-8537-43b8-a6be-b18bd7bbcaff" containerName="mariadb-account-create" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.986286 4913 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a31c09d0-abac-44cc-9d6d-94c05a99e577" containerName="mariadb-account-create" Oct 01 12:53:41 crc kubenswrapper[4913]: I1001 12:53:41.987088 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.017161 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c4466fb5f-6vzh5"] Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.093650 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.095251 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58bd451-a408-4ec8-908e-255afe71b949-logs\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.095348 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f58bd451-a408-4ec8-908e-255afe71b949-horizon-secret-key\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.095415 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-config-data\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.095452 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-scripts\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.095471 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx98l\" (UniqueName: \"kubernetes.io/projected/f58bd451-a408-4ec8-908e-255afe71b949-kube-api-access-hx98l\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.198232 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-config-data\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.198356 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-scripts\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.198401 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx98l\" (UniqueName: 
\"kubernetes.io/projected/f58bd451-a408-4ec8-908e-255afe71b949-kube-api-access-hx98l\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.198463 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58bd451-a408-4ec8-908e-255afe71b949-logs\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.199014 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f58bd451-a408-4ec8-908e-255afe71b949-horizon-secret-key\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.200481 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-scripts\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.200810 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58bd451-a408-4ec8-908e-255afe71b949-logs\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.202502 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-config-data\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.205828 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f58bd451-a408-4ec8-908e-255afe71b949-horizon-secret-key\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.223470 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx98l\" (UniqueName: \"kubernetes.io/projected/f58bd451-a408-4ec8-908e-255afe71b949-kube-api-access-hx98l\") pod \"horizon-5c4466fb5f-6vzh5\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.330677 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.683994 4913 generic.go:334] "Generic (PLEG): container finished" podID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerID="6173fd595b3f63b80b92d1efe7011d693c1aafa543de6b8df47e1603e6d83636" exitCode=0 Oct 01 12:53:42 crc kubenswrapper[4913]: I1001 12:53:42.684579 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" event={"ID":"60087e2e-0b0a-4a75-97be-912e06c0b17a","Type":"ContainerDied","Data":"6173fd595b3f63b80b92d1efe7011d693c1aafa543de6b8df47e1603e6d83636"} Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.536702 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qzddd"] Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.537965 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.543077 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.543579 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.545132 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kgnr4" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.551622 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qzddd"] Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.623246 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-config-data\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.623356 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-db-sync-config-data\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.623380 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-etc-machine-id\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.623401 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-scripts\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.623431 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-combined-ca-bundle\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " 
pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.623637 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swkz5\" (UniqueName: \"kubernetes.io/projected/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-kube-api-access-swkz5\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725121 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-config-data\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725165 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-db-sync-config-data\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725183 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-etc-machine-id\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725207 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-scripts\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725240 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-combined-ca-bundle\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725292 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swkz5\" (UniqueName: \"kubernetes.io/projected/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-kube-api-access-swkz5\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.725306 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-etc-machine-id\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.732027 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-scripts\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.732887 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-combined-ca-bundle\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.737326 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-config-data\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.743406 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-db-sync-config-data\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.786052 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swkz5\" (UniqueName: \"kubernetes.io/projected/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-kube-api-access-swkz5\") pod \"cinder-db-sync-qzddd\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.856688 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qzddd" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.870744 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-x9gmn"] Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.871948 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.879677 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x9gmn"] Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.882731 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sl6r4" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.885553 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.929989 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-db-sync-config-data\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.930050 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-combined-ca-bundle\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:43 crc kubenswrapper[4913]: I1001 12:53:43.930084 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-kube-api-access-wfztb\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 
12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.031656 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-db-sync-config-data\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.031700 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-combined-ca-bundle\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.031772 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-kube-api-access-wfztb\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.035764 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-combined-ca-bundle\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.037729 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-db-sync-config-data\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.063111 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-kube-api-access-wfztb\") pod \"barbican-db-sync-x9gmn\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.195324 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.719592 4913 generic.go:334] "Generic (PLEG): container finished" podID="dcb997eb-469c-4436-9ac8-e814b6488263" containerID="cd8de2d4fdb47138d9c8b7c9ec4d871b0d65877ad8eef06ab76e7383fcc813db" exitCode=0 Oct 01 12:53:44 crc kubenswrapper[4913]: I1001 12:53:44.719642 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcx4f" event={"ID":"dcb997eb-469c-4436-9ac8-e814b6488263","Type":"ContainerDied","Data":"cd8de2d4fdb47138d9c8b7c9ec4d871b0d65877ad8eef06ab76e7383fcc813db"} Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.598033 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f41e-account-create-5hkwk"] Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.599673 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.601578 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.604162 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f41e-account-create-5hkwk"] Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.775752 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql75g\" (UniqueName: \"kubernetes.io/projected/3523af18-f212-4eeb-8e62-5fabc32a4e6c-kube-api-access-ql75g\") pod \"neutron-f41e-account-create-5hkwk\" (UID: \"3523af18-f212-4eeb-8e62-5fabc32a4e6c\") " pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.877624 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql75g\" (UniqueName: \"kubernetes.io/projected/3523af18-f212-4eeb-8e62-5fabc32a4e6c-kube-api-access-ql75g\") pod \"neutron-f41e-account-create-5hkwk\" (UID: \"3523af18-f212-4eeb-8e62-5fabc32a4e6c\") " pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.913751 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql75g\" (UniqueName: \"kubernetes.io/projected/3523af18-f212-4eeb-8e62-5fabc32a4e6c-kube-api-access-ql75g\") pod \"neutron-f41e-account-create-5hkwk\" (UID: \"3523af18-f212-4eeb-8e62-5fabc32a4e6c\") " pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:53:48 crc kubenswrapper[4913]: I1001 12:53:48.932578 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.050041 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7df6769d97-bz6l5"] Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.093958 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b6c9764d-c6wjv"] Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.099023 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.102089 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.113073 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b6c9764d-c6wjv"] Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.166437 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c4466fb5f-6vzh5"] Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183353 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-logs\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183485 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-secret-key\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183534 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-tls-certs\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183566 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9b9j\" (UniqueName: \"kubernetes.io/projected/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-kube-api-access-v9b9j\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183613 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-combined-ca-bundle\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183663 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-config-data\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.183685 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-scripts\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.198763 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-969db9cf8-b2hmw"] Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.200220 4913 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.221545 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-969db9cf8-b2hmw"] Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287378 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-config-data\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287422 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-scripts\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287450 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-logs\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287483 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67e15e0-4c9f-492c-b38c-7955b5830285-scripts\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287512 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-horizon-tls-certs\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287541 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-horizon-secret-key\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287580 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67e15e0-4c9f-492c-b38c-7955b5830285-logs\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287617 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-secret-key\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287647 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e67e15e0-4c9f-492c-b38c-7955b5830285-config-data\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287674 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-tls-certs\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287701 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m248v\" (UniqueName: \"kubernetes.io/projected/e67e15e0-4c9f-492c-b38c-7955b5830285-kube-api-access-m248v\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287719 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9b9j\" (UniqueName: \"kubernetes.io/projected/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-kube-api-access-v9b9j\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287757 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-combined-ca-bundle\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.287777 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-combined-ca-bundle\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.288857 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-scripts\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.288853 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-logs\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.291369 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-config-data\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.294934 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-combined-ca-bundle\") pod \"horizon-7b6c9764d-c6wjv\" (UID: 
\"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.295088 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-tls-certs\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.301800 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-secret-key\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.326823 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9b9j\" (UniqueName: \"kubernetes.io/projected/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-kube-api-access-v9b9j\") pod \"horizon-7b6c9764d-c6wjv\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.388832 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-combined-ca-bundle\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.388896 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67e15e0-4c9f-492c-b38c-7955b5830285-scripts\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.388918 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-horizon-tls-certs\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.388944 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-horizon-secret-key\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.388996 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67e15e0-4c9f-492c-b38c-7955b5830285-logs\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.389040 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67e15e0-4c9f-492c-b38c-7955b5830285-config-data\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.389441 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67e15e0-4c9f-492c-b38c-7955b5830285-logs\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.389837 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67e15e0-4c9f-492c-b38c-7955b5830285-scripts\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.390397 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67e15e0-4c9f-492c-b38c-7955b5830285-config-data\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.390482 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m248v\" (UniqueName: \"kubernetes.io/projected/e67e15e0-4c9f-492c-b38c-7955b5830285-kube-api-access-m248v\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.392358 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-horizon-secret-key\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.392729 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-combined-ca-bundle\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.393223 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e67e15e0-4c9f-492c-b38c-7955b5830285-horizon-tls-certs\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.405018 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m248v\" (UniqueName: \"kubernetes.io/projected/e67e15e0-4c9f-492c-b38c-7955b5830285-kube-api-access-m248v\") pod \"horizon-969db9cf8-b2hmw\" (UID: \"e67e15e0-4c9f-492c-b38c-7955b5830285\") " pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.421031 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:53:49 crc kubenswrapper[4913]: I1001 12:53:49.527712 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:53:51 crc kubenswrapper[4913]: E1001 12:53:51.936110 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070" Oct 01 12:53:51 crc kubenswrapper[4913]: E1001 12:53:51.936726 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2qlb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-b6l67_openstack(57296118-560c-4764-b94a-472d8467f7c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:53:51 crc kubenswrapper[4913]: E1001 12:53:51.938332 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-b6l67" podUID="57296118-560c-4764-b94a-472d8467f7c0" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.009032 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bcx4f" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.013957 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.035037 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-credential-keys\") pod \"dcb997eb-469c-4436-9ac8-e814b6488263\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.035120 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsxhk\" (UniqueName: \"kubernetes.io/projected/fee1115e-9951-48e7-89c0-b5c676716145-kube-api-access-rsxhk\") pod \"fee1115e-9951-48e7-89c0-b5c676716145\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.035147 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-sb\") pod \"fee1115e-9951-48e7-89c0-b5c676716145\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.036132 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-fernet-keys\") pod \"dcb997eb-469c-4436-9ac8-e814b6488263\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.073020 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee1115e-9951-48e7-89c0-b5c676716145-kube-api-access-rsxhk" (OuterVolumeSpecName: "kube-api-access-rsxhk") pod "fee1115e-9951-48e7-89c0-b5c676716145" (UID: "fee1115e-9951-48e7-89c0-b5c676716145"). InnerVolumeSpecName "kube-api-access-rsxhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.073247 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dcb997eb-469c-4436-9ac8-e814b6488263" (UID: "dcb997eb-469c-4436-9ac8-e814b6488263"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.074449 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dcb997eb-469c-4436-9ac8-e814b6488263" (UID: "dcb997eb-469c-4436-9ac8-e814b6488263"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.076402 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fee1115e-9951-48e7-89c0-b5c676716145" (UID: "fee1115e-9951-48e7-89c0-b5c676716145"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138258 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-combined-ca-bundle\") pod \"dcb997eb-469c-4436-9ac8-e814b6488263\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138334 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-scripts\") pod \"dcb997eb-469c-4436-9ac8-e814b6488263\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138400 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-nb\") pod \"fee1115e-9951-48e7-89c0-b5c676716145\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138444 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2l54\" (UniqueName: \"kubernetes.io/projected/dcb997eb-469c-4436-9ac8-e814b6488263-kube-api-access-q2l54\") pod \"dcb997eb-469c-4436-9ac8-e814b6488263\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138507 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-config\") pod \"fee1115e-9951-48e7-89c0-b5c676716145\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138803 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-config-data\") pod \"dcb997eb-469c-4436-9ac8-e814b6488263\" (UID: \"dcb997eb-469c-4436-9ac8-e814b6488263\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.138837 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-dns-svc\") pod \"fee1115e-9951-48e7-89c0-b5c676716145\" (UID: \"fee1115e-9951-48e7-89c0-b5c676716145\") " Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.139391 4913 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.139416 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsxhk\" (UniqueName: \"kubernetes.io/projected/fee1115e-9951-48e7-89c0-b5c676716145-kube-api-access-rsxhk\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.139430 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.139442 4913 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-fernet-keys\") on node 
\"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.141328 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb997eb-469c-4436-9ac8-e814b6488263-kube-api-access-q2l54" (OuterVolumeSpecName: "kube-api-access-q2l54") pod "dcb997eb-469c-4436-9ac8-e814b6488263" (UID: "dcb997eb-469c-4436-9ac8-e814b6488263"). InnerVolumeSpecName "kube-api-access-q2l54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.159403 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fee1115e-9951-48e7-89c0-b5c676716145" (UID: "fee1115e-9951-48e7-89c0-b5c676716145"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.160914 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-config" (OuterVolumeSpecName: "config") pod "fee1115e-9951-48e7-89c0-b5c676716145" (UID: "fee1115e-9951-48e7-89c0-b5c676716145"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.170863 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-scripts" (OuterVolumeSpecName: "scripts") pod "dcb997eb-469c-4436-9ac8-e814b6488263" (UID: "dcb997eb-469c-4436-9ac8-e814b6488263"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.174979 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fee1115e-9951-48e7-89c0-b5c676716145" (UID: "fee1115e-9951-48e7-89c0-b5c676716145"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.180086 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-config-data" (OuterVolumeSpecName: "config-data") pod "dcb997eb-469c-4436-9ac8-e814b6488263" (UID: "dcb997eb-469c-4436-9ac8-e814b6488263"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.180342 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcb997eb-469c-4436-9ac8-e814b6488263" (UID: "dcb997eb-469c-4436-9ac8-e814b6488263"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240565 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240614 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240627 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240639 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2l54\" (UniqueName: \"kubernetes.io/projected/dcb997eb-469c-4436-9ac8-e814b6488263-kube-api-access-q2l54\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240652 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240663 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb997eb-469c-4436-9ac8-e814b6488263-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.240674 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee1115e-9951-48e7-89c0-b5c676716145-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.812726 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.816000 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bcx4f" Oct 01 12:53:52 crc kubenswrapper[4913]: E1001 12:53:52.819183 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070\\\"\"" pod="openstack/glance-db-sync-b6l67" podUID="57296118-560c-4764-b94a-472d8467f7c0" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.821260 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744bbc95cf-4zm5x" event={"ID":"fee1115e-9951-48e7-89c0-b5c676716145","Type":"ContainerDied","Data":"eeb869785d48b260d6fd91c4ce1b9b934c4ee23e70003612f05345d291537c2f"} Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.821311 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcx4f" event={"ID":"dcb997eb-469c-4436-9ac8-e814b6488263","Type":"ContainerDied","Data":"96d9cfd048b364f67dd44ef4043e1c04e85c512b405c301003269468d1d54370"} Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.821323 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d9cfd048b364f67dd44ef4043e1c04e85c512b405c301003269468d1d54370" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.821340 4913 scope.go:117] "RemoveContainer" containerID="c0c02d8eca4c42c7b3eabc674bcdad60a870fa1093e326cf2ecab4f5c7a50d6a" Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.881370 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744bbc95cf-4zm5x"] Oct 01 12:53:52 crc kubenswrapper[4913]: I1001 12:53:52.889034 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744bbc95cf-4zm5x"] Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.091846 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bcx4f"] Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.100014 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bcx4f"] Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.185279 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kwlhs"] Oct 01 12:53:53 crc kubenswrapper[4913]: E1001 12:53:53.185715 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb997eb-469c-4436-9ac8-e814b6488263" containerName="keystone-bootstrap" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.185734 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb997eb-469c-4436-9ac8-e814b6488263" containerName="keystone-bootstrap" Oct 01 12:53:53 crc kubenswrapper[4913]: E1001 12:53:53.185754 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee1115e-9951-48e7-89c0-b5c676716145" containerName="init" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.185761 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee1115e-9951-48e7-89c0-b5c676716145" containerName="init" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.185918 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee1115e-9951-48e7-89c0-b5c676716145" containerName="init" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.185940 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb997eb-469c-4436-9ac8-e814b6488263" containerName="keystone-bootstrap" Oct 01 12:53:53 crc kubenswrapper[4913]: 
I1001 12:53:53.186518 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.188630 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lbn2j" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.188797 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.188855 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.188897 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.191442 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kwlhs"] Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.360976 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-scripts\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.361055 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-credential-keys\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.361101 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkk9\" (UniqueName: \"kubernetes.io/projected/872e2d84-7827-401f-bf95-60df7954e22e-kube-api-access-sdkk9\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.361130 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-config-data\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.361347 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-fernet-keys\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.361395 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-combined-ca-bundle\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.463214 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-fernet-keys\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.463256 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-combined-ca-bundle\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.463305 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-scripts\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.463355 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-credential-keys\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.463399 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkk9\" (UniqueName: \"kubernetes.io/projected/872e2d84-7827-401f-bf95-60df7954e22e-kube-api-access-sdkk9\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.463424 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-config-data\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.468250 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-fernet-keys\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.470140 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-config-data\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.470409 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-combined-ca-bundle\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.474833 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-credential-keys\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " 
pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.479587 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkk9\" (UniqueName: \"kubernetes.io/projected/872e2d84-7827-401f-bf95-60df7954e22e-kube-api-access-sdkk9\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.481726 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-scripts\") pod \"keystone-bootstrap-kwlhs\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:53 crc kubenswrapper[4913]: I1001 12:53:53.543976 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:53:54 crc kubenswrapper[4913]: E1001 12:53:54.406864 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350" Oct 01 12:53:54 crc kubenswrapper[4913]: E1001 12:53:54.408100 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n676h95h657h56bh54dh9bh5d7h669h95h575h695h685h5ch5b9h66dh55bh9dh675h588h675h577h674h588h67h698h56bh5d4h568h565h59fh84h5bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6gmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e43a3fef-6dd6-4239-b6de-028dfa7145fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:53:54 crc kubenswrapper[4913]: I1001 12:53:54.830965 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb997eb-469c-4436-9ac8-e814b6488263" path="/var/lib/kubelet/pods/dcb997eb-469c-4436-9ac8-e814b6488263/volumes" Oct 01 12:53:54 crc kubenswrapper[4913]: I1001 12:53:54.831898 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee1115e-9951-48e7-89c0-b5c676716145" path="/var/lib/kubelet/pods/fee1115e-9951-48e7-89c0-b5c676716145/volumes" Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.897099 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" event={"ID":"60087e2e-0b0a-4a75-97be-912e06c0b17a","Type":"ContainerStarted","Data":"ba348fb80160b49f753b98eea15c0b9164598e4a361122db72555e1cf88b64d5"} Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.897634 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.907761 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r5fv" event={"ID":"b3950662-6b64-4585-8cb2-8c94623a3d66","Type":"ContainerStarted","Data":"abe58ab5ed439f472f92760a23b80bd669796d967e771e2cecc0636a0a3b62a8"} Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.909589 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f465bcb97-7bzq4" event={"ID":"50a601b3-1f16-4504-bc7c-aa573c34764e","Type":"ContainerStarted","Data":"fbbb0bf19b5d08e8e2c6509bed9f36d1976dd36486be4781f3b2b6bb8b726623"} Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.909757 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f465bcb97-7bzq4" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon-log" containerID="cri-o://fbbb0bf19b5d08e8e2c6509bed9f36d1976dd36486be4781f3b2b6bb8b726623" gracePeriod=30 Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.909863 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f465bcb97-7bzq4" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon" containerID="cri-o://6cee38755edc4145b698c1dd532389282c8aaf423b9ffc8f989f4cbf59368f73" gracePeriod=30 Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.912556 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df6769d97-bz6l5" 
event={"ID":"80c570d0-d665-4680-a6e5-b4c7734a87af","Type":"ContainerStarted","Data":"af6f73bb43202e329df9438d340694a958048d36a7d533a6a92abb647e5e616f"} Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.918046 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" podStartSLOduration=21.918019703 podStartE2EDuration="21.918019703s" podCreationTimestamp="2025-10-01 12:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:01.911311729 +0000 UTC m=+973.814787327" watchObservedRunningTime="2025-10-01 12:54:01.918019703 +0000 UTC m=+973.821495281" Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.918912 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e43a3fef-6dd6-4239-b6de-028dfa7145fc","Type":"ContainerStarted","Data":"1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a"} Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.933330 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8r5fv" podStartSLOduration=1.9450997270000001 podStartE2EDuration="21.933313372s" podCreationTimestamp="2025-10-01 12:53:40 +0000 UTC" firstStartedPulling="2025-10-01 12:53:41.489975569 +0000 UTC m=+953.393451147" lastFinishedPulling="2025-10-01 12:54:01.478189214 +0000 UTC m=+973.381664792" observedRunningTime="2025-10-01 12:54:01.931587565 +0000 UTC m=+973.835063163" watchObservedRunningTime="2025-10-01 12:54:01.933313372 +0000 UTC m=+973.836788950" Oct 01 12:54:01 crc kubenswrapper[4913]: I1001 12:54:01.957352 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f465bcb97-7bzq4" podStartSLOduration=2.463660058 podStartE2EDuration="22.95733366s" podCreationTimestamp="2025-10-01 12:53:39 +0000 UTC" firstStartedPulling="2025-10-01 12:53:40.988248005 +0000 UTC m=+952.891723583" lastFinishedPulling="2025-10-01 12:54:01.481921607 +0000 UTC m=+973.385397185" observedRunningTime="2025-10-01 12:54:01.951071258 +0000 UTC m=+973.854546846" watchObservedRunningTime="2025-10-01 12:54:01.95733366 +0000 UTC m=+973.860809238" Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.006403 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b6c9764d-c6wjv"] Oct 01 12:54:02 crc kubenswrapper[4913]: W1001 12:54:02.015536 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c044c3c_7de5_45ec_b450_cbfaf0ca415f.slice/crio-612a9c8c65730a9caf3e4e4c5859019c1a76881474e0a7a2f1b470e089b60c5b WatchSource:0}: Error finding container 612a9c8c65730a9caf3e4e4c5859019c1a76881474e0a7a2f1b470e089b60c5b: Status 404 returned error can't find the container with id 612a9c8c65730a9caf3e4e4c5859019c1a76881474e0a7a2f1b470e089b60c5b Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.034774 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c4466fb5f-6vzh5"] Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.370708 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x9gmn"] Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.374830 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f41e-account-create-5hkwk"] Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.396413 4913 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/keystone-bootstrap-kwlhs"] Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.403457 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-969db9cf8-b2hmw"] Oct 01 12:54:02 crc kubenswrapper[4913]: W1001 12:54:02.416538 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f1ef1c4_7a72_4569_b21c_ef13cb766d25.slice/crio-125d90f0e82b07a0a7bb4c0b5c741f8357fe680f7b8269fe94f38423f4ce2229 WatchSource:0}: Error finding container 125d90f0e82b07a0a7bb4c0b5c741f8357fe680f7b8269fe94f38423f4ce2229: Status 404 returned error can't find the container with id 125d90f0e82b07a0a7bb4c0b5c741f8357fe680f7b8269fe94f38423f4ce2229 Oct 01 12:54:02 crc kubenswrapper[4913]: W1001 12:54:02.417150 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3523af18_f212_4eeb_8e62_5fabc32a4e6c.slice/crio-d245b468cc2043d51587c47d1f916abc8c3389e9679a27d1ccc5d5bb11f94d31 WatchSource:0}: Error finding container d245b468cc2043d51587c47d1f916abc8c3389e9679a27d1ccc5d5bb11f94d31: Status 404 returned error can't find the container with id d245b468cc2043d51587c47d1f916abc8c3389e9679a27d1ccc5d5bb11f94d31 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.437654 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qzddd"] Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.443194 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 12:54:02 crc kubenswrapper[4913]: W1001 12:54:02.446773 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode67e15e0_4c9f_492c_b38c_7955b5830285.slice/crio-ad72b126958f3191f1830b66b7cfeb31d1fb5a239499f38faae32b17a092a591 WatchSource:0}: Error finding container ad72b126958f3191f1830b66b7cfeb31d1fb5a239499f38faae32b17a092a591: Status 404 returned error can't find the container with id ad72b126958f3191f1830b66b7cfeb31d1fb5a239499f38faae32b17a092a591 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.941666 4913 generic.go:334] "Generic (PLEG): container finished" podID="3523af18-f212-4eeb-8e62-5fabc32a4e6c" containerID="da08a47649e49b8fb350534810eca2b16320ed6ac6621b509d9fc5bee0170851" exitCode=0 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.941979 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f41e-account-create-5hkwk" event={"ID":"3523af18-f212-4eeb-8e62-5fabc32a4e6c","Type":"ContainerDied","Data":"da08a47649e49b8fb350534810eca2b16320ed6ac6621b509d9fc5bee0170851"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.942005 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f41e-account-create-5hkwk" event={"ID":"3523af18-f212-4eeb-8e62-5fabc32a4e6c","Type":"ContainerStarted","Data":"d245b468cc2043d51587c47d1f916abc8c3389e9679a27d1ccc5d5bb11f94d31"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.945455 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f465bcb97-7bzq4" event={"ID":"50a601b3-1f16-4504-bc7c-aa573c34764e","Type":"ContainerStarted","Data":"6cee38755edc4145b698c1dd532389282c8aaf423b9ffc8f989f4cbf59368f73"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.950477 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kwlhs" 
event={"ID":"872e2d84-7827-401f-bf95-60df7954e22e","Type":"ContainerStarted","Data":"0733b180aa15fc2fdb00801899e46a4bc69a580d010f4ca862fef2e338a3efbd"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.950514 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kwlhs" event={"ID":"872e2d84-7827-401f-bf95-60df7954e22e","Type":"ContainerStarted","Data":"1fcd8c2d810e6b089c4416987a33d9933fd57c637c648fb7cd823f06c01fbd98"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.962961 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df6769d97-bz6l5" event={"ID":"80c570d0-d665-4680-a6e5-b4c7734a87af","Type":"ContainerStarted","Data":"c63efd07dbfbb67af7f524c5157345f80bdc3e9222146bf65e9bc5fc64c117c7"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.963082 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7df6769d97-bz6l5" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon-log" containerID="cri-o://af6f73bb43202e329df9438d340694a958048d36a7d533a6a92abb647e5e616f" gracePeriod=30 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.963263 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7df6769d97-bz6l5" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon" containerID="cri-o://c63efd07dbfbb67af7f524c5157345f80bdc3e9222146bf65e9bc5fc64c117c7" gracePeriod=30 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.975019 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4466fb5f-6vzh5" event={"ID":"f58bd451-a408-4ec8-908e-255afe71b949","Type":"ContainerStarted","Data":"a39b8415de589fc7e47a35723f6b40f66128e99e48801a4dee0b5e52069ed0b8"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.975062 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4466fb5f-6vzh5" event={"ID":"f58bd451-a408-4ec8-908e-255afe71b949","Type":"ContainerStarted","Data":"b267d813692ffffb75e7b38084d91ce1ec41224bd100ec447c3e0afb2af3bdb7"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.975072 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4466fb5f-6vzh5" event={"ID":"f58bd451-a408-4ec8-908e-255afe71b949","Type":"ContainerStarted","Data":"c63a8e9db01eda3dd5b0994373a5712ebf7fb1e666144bba233b6d20e5c136cf"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.975168 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c4466fb5f-6vzh5" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon-log" containerID="cri-o://b267d813692ffffb75e7b38084d91ce1ec41224bd100ec447c3e0afb2af3bdb7" gracePeriod=30 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.975401 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c4466fb5f-6vzh5" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon" containerID="cri-o://a39b8415de589fc7e47a35723f6b40f66128e99e48801a4dee0b5e52069ed0b8" gracePeriod=30 Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.977460 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gmn" event={"ID":"0f1ef1c4-7a72-4569-b21c-ef13cb766d25","Type":"ContainerStarted","Data":"125d90f0e82b07a0a7bb4c0b5c741f8357fe680f7b8269fe94f38423f4ce2229"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.981879 4913 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/keystone-bootstrap-kwlhs" podStartSLOduration=9.981865755 podStartE2EDuration="9.981865755s" podCreationTimestamp="2025-10-01 12:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:02.975973264 +0000 UTC m=+974.879448852" watchObservedRunningTime="2025-10-01 12:54:02.981865755 +0000 UTC m=+974.885341333" Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.983152 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6c9764d-c6wjv" event={"ID":"3c044c3c-7de5-45ec-b450-cbfaf0ca415f","Type":"ContainerStarted","Data":"59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.983186 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6c9764d-c6wjv" event={"ID":"3c044c3c-7de5-45ec-b450-cbfaf0ca415f","Type":"ContainerStarted","Data":"eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.983196 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6c9764d-c6wjv" event={"ID":"3c044c3c-7de5-45ec-b450-cbfaf0ca415f","Type":"ContainerStarted","Data":"612a9c8c65730a9caf3e4e4c5859019c1a76881474e0a7a2f1b470e089b60c5b"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.992826 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-969db9cf8-b2hmw" event={"ID":"e67e15e0-4c9f-492c-b38c-7955b5830285","Type":"ContainerStarted","Data":"3540cf598ef7e863a3713c1622a7bef04bac09f3fe7223b255010b7c67ddde8f"} Oct 01 12:54:02 crc kubenswrapper[4913]: I1001 12:54:02.992863 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-969db9cf8-b2hmw" event={"ID":"e67e15e0-4c9f-492c-b38c-7955b5830285","Type":"ContainerStarted","Data":"ad72b126958f3191f1830b66b7cfeb31d1fb5a239499f38faae32b17a092a591"} Oct 01 12:54:03 crc kubenswrapper[4913]: I1001 12:54:03.000200 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzddd" event={"ID":"706c5fb0-a691-4f92-bb4e-a6ba720abfa1","Type":"ContainerStarted","Data":"272efc5b1b50c78ce94135ef75305fa56f423025c9cba7737c31de53cb023d8d"} Oct 01 12:54:03 crc kubenswrapper[4913]: I1001 12:54:03.039848 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c4466fb5f-6vzh5" podStartSLOduration=22.039828893 podStartE2EDuration="22.039828893s" podCreationTimestamp="2025-10-01 12:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:03.030567239 +0000 UTC m=+974.934042827" watchObservedRunningTime="2025-10-01 12:54:03.039828893 +0000 UTC m=+974.943304471" Oct 01 12:54:03 crc kubenswrapper[4913]: I1001 12:54:03.043563 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7df6769d97-bz6l5" podStartSLOduration=3.125915283 podStartE2EDuration="23.043548845s" podCreationTimestamp="2025-10-01 12:53:40 +0000 UTC" firstStartedPulling="2025-10-01 12:53:41.602670536 +0000 UTC m=+953.506146114" lastFinishedPulling="2025-10-01 12:54:01.520304088 +0000 UTC m=+973.423779676" observedRunningTime="2025-10-01 12:54:03.002712787 +0000 UTC m=+974.906188385" watchObservedRunningTime="2025-10-01 12:54:03.043548845 +0000 UTC m=+974.947024423" Oct 01 12:54:03 crc kubenswrapper[4913]: I1001 12:54:03.067012 4913 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b6c9764d-c6wjv" podStartSLOduration=14.066994877 podStartE2EDuration="14.066994877s" podCreationTimestamp="2025-10-01 12:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:03.061566149 +0000 UTC m=+974.965041747" watchObservedRunningTime="2025-10-01 12:54:03.066994877 +0000 UTC m=+974.970470445" Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.019021 4913 generic.go:334] "Generic (PLEG): container finished" podID="b3950662-6b64-4585-8cb2-8c94623a3d66" containerID="abe58ab5ed439f472f92760a23b80bd669796d967e771e2cecc0636a0a3b62a8" exitCode=0 Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.019083 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r5fv" event={"ID":"b3950662-6b64-4585-8cb2-8c94623a3d66","Type":"ContainerDied","Data":"abe58ab5ed439f472f92760a23b80bd669796d967e771e2cecc0636a0a3b62a8"} Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.023616 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-969db9cf8-b2hmw" event={"ID":"e67e15e0-4c9f-492c-b38c-7955b5830285","Type":"ContainerStarted","Data":"f9bb68382bb28b1b72d17866d30a7e39418b92c5011d864d619e116c0ab8648f"} Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.065172 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-969db9cf8-b2hmw" podStartSLOduration=15.06515081 podStartE2EDuration="15.06515081s" podCreationTimestamp="2025-10-01 12:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:04.056487943 +0000 UTC m=+975.959963541" watchObservedRunningTime="2025-10-01 12:54:04.06515081 +0000 UTC m=+975.968626388" Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.511078 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.698176 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql75g\" (UniqueName: \"kubernetes.io/projected/3523af18-f212-4eeb-8e62-5fabc32a4e6c-kube-api-access-ql75g\") pod \"3523af18-f212-4eeb-8e62-5fabc32a4e6c\" (UID: \"3523af18-f212-4eeb-8e62-5fabc32a4e6c\") " Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.704198 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3523af18-f212-4eeb-8e62-5fabc32a4e6c-kube-api-access-ql75g" (OuterVolumeSpecName: "kube-api-access-ql75g") pod "3523af18-f212-4eeb-8e62-5fabc32a4e6c" (UID: "3523af18-f212-4eeb-8e62-5fabc32a4e6c"). InnerVolumeSpecName "kube-api-access-ql75g". 
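
The pod_startup_latency_tracker entries encode a simple relation: podStartSLOduration is the end-to-end startup time minus the time spent pulling images, which is why pods whose pull timestamps are the zero time ("0001-01-01 ...") report an SLO duration equal to the E2E duration. For placement-db-sync-8r5fv above, pulling ran from m=+953.393451147 to m=+973.381664792, i.e. 19.988213645s, and 21.933313372s minus that is exactly the logged 1.945099727. The same check in Go, with the values copied from the log:

package main

import (
	"fmt"
	"time"
)

// podStartSLOduration = podStartE2EDuration - time spent pulling images.
// The offsets are the m=+... monotonic values logged for placement-db-sync-8r5fv.
func mustParse(s string) time.Duration {
	d, err := time.ParseDuration(s)
	if err != nil {
		panic(err)
	}
	return d
}

func main() {
	e2e := mustParse("21.933313372s")
	firstStartedPulling := mustParse("953.393451147s")
	lastFinishedPulling := mustParse("973.381664792s")
	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	fmt.Println(slo) // 1.945099727s, matching the logged podStartSLOduration
}
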
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:04 crc kubenswrapper[4913]: I1001 12:54:04.801022 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql75g\" (UniqueName: \"kubernetes.io/projected/3523af18-f212-4eeb-8e62-5fabc32a4e6c-kube-api-access-ql75g\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.046243 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f41e-account-create-5hkwk" event={"ID":"3523af18-f212-4eeb-8e62-5fabc32a4e6c","Type":"ContainerDied","Data":"d245b468cc2043d51587c47d1f916abc8c3389e9679a27d1ccc5d5bb11f94d31"} Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.046459 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d245b468cc2043d51587c47d1f916abc8c3389e9679a27d1ccc5d5bb11f94d31" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.046307 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f41e-account-create-5hkwk" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.485323 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8r5fv" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.614734 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3950662-6b64-4585-8cb2-8c94623a3d66-logs\") pod \"b3950662-6b64-4585-8cb2-8c94623a3d66\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.614793 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-config-data\") pod \"b3950662-6b64-4585-8cb2-8c94623a3d66\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.614959 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-scripts\") pod \"b3950662-6b64-4585-8cb2-8c94623a3d66\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.615021 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-combined-ca-bundle\") pod \"b3950662-6b64-4585-8cb2-8c94623a3d66\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.615044 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzj6\" (UniqueName: \"kubernetes.io/projected/b3950662-6b64-4585-8cb2-8c94623a3d66-kube-api-access-pjzj6\") pod \"b3950662-6b64-4585-8cb2-8c94623a3d66\" (UID: \"b3950662-6b64-4585-8cb2-8c94623a3d66\") " Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.615172 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3950662-6b64-4585-8cb2-8c94623a3d66-logs" (OuterVolumeSpecName: "logs") pod "b3950662-6b64-4585-8cb2-8c94623a3d66" (UID: "b3950662-6b64-4585-8cb2-8c94623a3d66"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.615689 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3950662-6b64-4585-8cb2-8c94623a3d66-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.620340 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-scripts" (OuterVolumeSpecName: "scripts") pod "b3950662-6b64-4585-8cb2-8c94623a3d66" (UID: "b3950662-6b64-4585-8cb2-8c94623a3d66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.645113 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3950662-6b64-4585-8cb2-8c94623a3d66-kube-api-access-pjzj6" (OuterVolumeSpecName: "kube-api-access-pjzj6") pod "b3950662-6b64-4585-8cb2-8c94623a3d66" (UID: "b3950662-6b64-4585-8cb2-8c94623a3d66"). InnerVolumeSpecName "kube-api-access-pjzj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.648079 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-config-data" (OuterVolumeSpecName: "config-data") pod "b3950662-6b64-4585-8cb2-8c94623a3d66" (UID: "b3950662-6b64-4585-8cb2-8c94623a3d66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.651309 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3950662-6b64-4585-8cb2-8c94623a3d66" (UID: "b3950662-6b64-4585-8cb2-8c94623a3d66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.718522 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.718589 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.718602 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzj6\" (UniqueName: \"kubernetes.io/projected/b3950662-6b64-4585-8cb2-8c94623a3d66-kube-api-access-pjzj6\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4913]: I1001 12:54:05.718613 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3950662-6b64-4585-8cb2-8c94623a3d66-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.072741 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r5fv" event={"ID":"b3950662-6b64-4585-8cb2-8c94623a3d66","Type":"ContainerDied","Data":"885970f33c0db7405af2b604c9248730a9f669c7c4ac2077665cb1ac7ba071f5"} Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.072891 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885970f33c0db7405af2b604c9248730a9f669c7c4ac2077665cb1ac7ba071f5" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.072967 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8r5fv" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.151755 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-66d867dfb6-r9zrq"] Oct 01 12:54:06 crc kubenswrapper[4913]: E1001 12:54:06.159123 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3523af18-f212-4eeb-8e62-5fabc32a4e6c" containerName="mariadb-account-create" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.159152 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3523af18-f212-4eeb-8e62-5fabc32a4e6c" containerName="mariadb-account-create" Oct 01 12:54:06 crc kubenswrapper[4913]: E1001 12:54:06.159167 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3950662-6b64-4585-8cb2-8c94623a3d66" containerName="placement-db-sync" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.159173 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3950662-6b64-4585-8cb2-8c94623a3d66" containerName="placement-db-sync" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.159477 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3950662-6b64-4585-8cb2-8c94623a3d66" containerName="placement-db-sync" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.159533 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3523af18-f212-4eeb-8e62-5fabc32a4e6c" containerName="mariadb-account-create" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.160637 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.162372 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.162800 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.163488 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tfwq6" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.163935 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.164151 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.165878 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66d867dfb6-r9zrq"] Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.329273 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-combined-ca-bundle\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.329329 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-public-tls-certs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.329359 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11336389-1acf-4342-b478-e11f04e7848d-logs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.329408 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9x7z\" (UniqueName: \"kubernetes.io/projected/11336389-1acf-4342-b478-e11f04e7848d-kube-api-access-c9x7z\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.329645 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-config-data\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.329793 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-internal-tls-certs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 
12:54:06.329831 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-scripts\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.431777 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-config-data\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.432110 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-internal-tls-certs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.432228 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-scripts\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.432366 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-combined-ca-bundle\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.432458 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-public-tls-certs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.432564 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11336389-1acf-4342-b478-e11f04e7848d-logs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.432703 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9x7z\" (UniqueName: \"kubernetes.io/projected/11336389-1acf-4342-b478-e11f04e7848d-kube-api-access-c9x7z\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.437886 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-internal-tls-certs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.438079 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-config-data\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.438610 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11336389-1acf-4342-b478-e11f04e7848d-logs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.443764 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-combined-ca-bundle\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.448953 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-scripts\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.454353 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9x7z\" (UniqueName: \"kubernetes.io/projected/11336389-1acf-4342-b478-e11f04e7848d-kube-api-access-c9x7z\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.457079 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11336389-1acf-4342-b478-e11f04e7848d-public-tls-certs\") pod \"placement-66d867dfb6-r9zrq\" (UID: \"11336389-1acf-4342-b478-e11f04e7848d\") " pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:06 crc kubenswrapper[4913]: I1001 12:54:06.494853 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:07 crc kubenswrapper[4913]: I1001 12:54:07.021958 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66d867dfb6-r9zrq"] Oct 01 12:54:07 crc kubenswrapper[4913]: I1001 12:54:07.085327 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66d867dfb6-r9zrq" event={"ID":"11336389-1acf-4342-b478-e11f04e7848d","Type":"ContainerStarted","Data":"2e65b56299ea67e5537e26a8427991d806fdc2fd6d9f1187d62f4290b16bf675"} Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.827539 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-szknr"] Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.832750 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.835413 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.835424 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k6bs9" Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.848627 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-szknr"] Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.865288 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.915732 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dfvx\" (UniqueName: \"kubernetes.io/projected/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-kube-api-access-8dfvx\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.916043 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-combined-ca-bundle\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:08 crc kubenswrapper[4913]: I1001 12:54:08.916152 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-config\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.026424 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dfvx\" (UniqueName: \"kubernetes.io/projected/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-kube-api-access-8dfvx\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.026470 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-combined-ca-bundle\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.026502 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-config\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.041635 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-combined-ca-bundle\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.045877 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-config\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.046896 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dfvx\" (UniqueName: \"kubernetes.io/projected/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-kube-api-access-8dfvx\") pod \"neutron-db-sync-szknr\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") " pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.173766 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-szknr" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.421681 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.422013 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.528622 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:54:09 crc kubenswrapper[4913]: I1001 12:54:09.528673 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.111557 4913 generic.go:334] "Generic (PLEG): container finished" podID="872e2d84-7827-401f-bf95-60df7954e22e" containerID="0733b180aa15fc2fdb00801899e46a4bc69a580d010f4ca862fef2e338a3efbd" exitCode=0 Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.111647 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kwlhs" event={"ID":"872e2d84-7827-401f-bf95-60df7954e22e","Type":"ContainerDied","Data":"0733b180aa15fc2fdb00801899e46a4bc69a580d010f4ca862fef2e338a3efbd"} Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.303587 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f465bcb97-7bzq4" Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.592731 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7df6769d97-bz6l5" Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.623422 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.688006 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-8fr2n"] Oct 01 12:54:10 crc kubenswrapper[4913]: I1001 12:54:10.688289 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" containerID="cri-o://ba1215be9799425e72eaaeaa1a675a103ec50f80a68c1514503c9a71e2260e8c" gracePeriod=10 Oct 01 12:54:11 crc kubenswrapper[4913]: I1001 12:54:11.019092 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 01 12:54:11 crc kubenswrapper[4913]: I1001 12:54:11.122005 4913 generic.go:334] "Generic (PLEG): container finished" 
podID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerID="ba1215be9799425e72eaaeaa1a675a103ec50f80a68c1514503c9a71e2260e8c" exitCode=0 Oct 01 12:54:11 crc kubenswrapper[4913]: I1001 12:54:11.122179 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" event={"ID":"3f29a2a3-85a3-40a0-ab42-1e575dea129c","Type":"ContainerDied","Data":"ba1215be9799425e72eaaeaa1a675a103ec50f80a68c1514503c9a71e2260e8c"} Oct 01 12:54:12 crc kubenswrapper[4913]: I1001 12:54:12.332161 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:54:16 crc kubenswrapper[4913]: I1001 12:54:16.019048 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.198006 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kwlhs" event={"ID":"872e2d84-7827-401f-bf95-60df7954e22e","Type":"ContainerDied","Data":"1fcd8c2d810e6b089c4416987a33d9933fd57c637c648fb7cd823f06c01fbd98"} Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.198375 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fcd8c2d810e6b089c4416987a33d9933fd57c637c648fb7cd823f06c01fbd98" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.294826 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.427836 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-fernet-keys\") pod \"872e2d84-7827-401f-bf95-60df7954e22e\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.428075 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdkk9\" (UniqueName: \"kubernetes.io/projected/872e2d84-7827-401f-bf95-60df7954e22e-kube-api-access-sdkk9\") pod \"872e2d84-7827-401f-bf95-60df7954e22e\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.428117 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-config-data\") pod \"872e2d84-7827-401f-bf95-60df7954e22e\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.428160 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-scripts\") pod \"872e2d84-7827-401f-bf95-60df7954e22e\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.428191 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-combined-ca-bundle\") pod \"872e2d84-7827-401f-bf95-60df7954e22e\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.428299 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-credential-keys\") pod \"872e2d84-7827-401f-bf95-60df7954e22e\" (UID: \"872e2d84-7827-401f-bf95-60df7954e22e\") " Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.442475 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872e2d84-7827-401f-bf95-60df7954e22e-kube-api-access-sdkk9" (OuterVolumeSpecName: "kube-api-access-sdkk9") pod "872e2d84-7827-401f-bf95-60df7954e22e" (UID: "872e2d84-7827-401f-bf95-60df7954e22e"). InnerVolumeSpecName "kube-api-access-sdkk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.443228 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "872e2d84-7827-401f-bf95-60df7954e22e" (UID: "872e2d84-7827-401f-bf95-60df7954e22e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.444842 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "872e2d84-7827-401f-bf95-60df7954e22e" (UID: "872e2d84-7827-401f-bf95-60df7954e22e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.450800 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-config-data" (OuterVolumeSpecName: "config-data") pod "872e2d84-7827-401f-bf95-60df7954e22e" (UID: "872e2d84-7827-401f-bf95-60df7954e22e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.455397 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-scripts" (OuterVolumeSpecName: "scripts") pod "872e2d84-7827-401f-bf95-60df7954e22e" (UID: "872e2d84-7827-401f-bf95-60df7954e22e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.456480 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "872e2d84-7827-401f-bf95-60df7954e22e" (UID: "872e2d84-7827-401f-bf95-60df7954e22e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.530830 4913 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.530860 4913 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.530870 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdkk9\" (UniqueName: \"kubernetes.io/projected/872e2d84-7827-401f-bf95-60df7954e22e-kube-api-access-sdkk9\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.530880 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.530888 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:18 crc kubenswrapper[4913]: I1001 12:54:18.530896 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e2d84-7827-401f-bf95-60df7954e22e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.206325 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kwlhs" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.400062 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d8f89cd7f-sqjg8"] Oct 01 12:54:19 crc kubenswrapper[4913]: E1001 12:54:19.403116 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872e2d84-7827-401f-bf95-60df7954e22e" containerName="keystone-bootstrap" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.403175 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="872e2d84-7827-401f-bf95-60df7954e22e" containerName="keystone-bootstrap" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.403590 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="872e2d84-7827-401f-bf95-60df7954e22e" containerName="keystone-bootstrap" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.404996 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.407220 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lbn2j" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.407745 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.422107 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.422404 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.422511 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.422606 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.424882 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b6c9764d-c6wjv" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.140:8443: connect: connection refused" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.430041 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d8f89cd7f-sqjg8"] Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.530969 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-969db9cf8-b2hmw" podUID="e67e15e0-4c9f-492c-b38c-7955b5830285" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.141:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.141:8443: connect: connection refused" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.546222 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-config-data\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.546304 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-public-tls-certs\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.546333 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-fernet-keys\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.546599 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-combined-ca-bundle\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") 
" pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.546900 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-credential-keys\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.546974 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-scripts\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.547030 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kmn\" (UniqueName: \"kubernetes.io/projected/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-kube-api-access-j6kmn\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.547099 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-internal-tls-certs\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648551 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-combined-ca-bundle\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648627 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-credential-keys\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648654 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-scripts\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648673 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kmn\" (UniqueName: \"kubernetes.io/projected/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-kube-api-access-j6kmn\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648693 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-internal-tls-certs\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " 
pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648733 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-config-data\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648759 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-public-tls-certs\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.648775 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-fernet-keys\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.655539 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-config-data\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.656073 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-combined-ca-bundle\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.659927 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-internal-tls-certs\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.665812 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-credential-keys\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.666160 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-fernet-keys\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.669713 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-public-tls-certs\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.684638 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-scripts\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.694875 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kmn\" (UniqueName: \"kubernetes.io/projected/a29fe08d-d79a-48ba-b8b8-67eda446e3c6-kube-api-access-j6kmn\") pod \"keystone-7d8f89cd7f-sqjg8\" (UID: \"a29fe08d-d79a-48ba-b8b8-67eda446e3c6\") " pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:19 crc kubenswrapper[4913]: I1001 12:54:19.732813 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:20 crc kubenswrapper[4913]: E1001 12:54:20.878279 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1" Oct 01 12:54:20 crc kubenswrapper[4913]: E1001 12:54:20.878714 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6gmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e43a3fef-6dd6-4239-b6de-028dfa7145fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.736767 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.921950 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-sb\") pod \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.922028 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-config\") pod \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.922147 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kdsd\" (UniqueName: \"kubernetes.io/projected/3f29a2a3-85a3-40a0-ab42-1e575dea129c-kube-api-access-5kdsd\") pod \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.922217 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-dns-svc\") pod \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.922318 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-nb\") pod \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\" (UID: \"3f29a2a3-85a3-40a0-ab42-1e575dea129c\") " Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.931428 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f29a2a3-85a3-40a0-ab42-1e575dea129c-kube-api-access-5kdsd" (OuterVolumeSpecName: "kube-api-access-5kdsd") pod "3f29a2a3-85a3-40a0-ab42-1e575dea129c" (UID: "3f29a2a3-85a3-40a0-ab42-1e575dea129c"). InnerVolumeSpecName "kube-api-access-5kdsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.988158 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f29a2a3-85a3-40a0-ab42-1e575dea129c" (UID: "3f29a2a3-85a3-40a0-ab42-1e575dea129c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:23 crc kubenswrapper[4913]: I1001 12:54:23.994698 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-config" (OuterVolumeSpecName: "config") pod "3f29a2a3-85a3-40a0-ab42-1e575dea129c" (UID: "3f29a2a3-85a3-40a0-ab42-1e575dea129c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.024439 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.024469 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.024482 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kdsd\" (UniqueName: \"kubernetes.io/projected/3f29a2a3-85a3-40a0-ab42-1e575dea129c-kube-api-access-5kdsd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.031528 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f29a2a3-85a3-40a0-ab42-1e575dea129c" (UID: "3f29a2a3-85a3-40a0-ab42-1e575dea129c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.042683 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f29a2a3-85a3-40a0-ab42-1e575dea129c" (UID: "3f29a2a3-85a3-40a0-ab42-1e575dea129c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.125816 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.125852 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f29a2a3-85a3-40a0-ab42-1e575dea129c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.250880 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" event={"ID":"3f29a2a3-85a3-40a0-ab42-1e575dea129c","Type":"ContainerDied","Data":"24876a6dad316d09d49e8c1251d537c6db18e24ce52e374c635f2d604950d138"} Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.250934 4913 scope.go:117] "RemoveContainer" containerID="ba1215be9799425e72eaaeaa1a675a103ec50f80a68c1514503c9a71e2260e8c" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.251063 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.292423 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-8fr2n"] Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.307480 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-8fr2n"] Oct 01 12:54:24 crc kubenswrapper[4913]: I1001 12:54:24.820487 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" path="/var/lib/kubelet/pods/3f29a2a3-85a3-40a0-ab42-1e575dea129c/volumes" Oct 01 12:54:26 crc kubenswrapper[4913]: I1001 12:54:26.019101 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d86d68bf7-8fr2n" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Oct 01 12:54:31 crc kubenswrapper[4913]: I1001 12:54:31.265059 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:54:32 crc kubenswrapper[4913]: E1001 12:54:32.531417 4913 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a601b3_1f16_4504_bc7c_aa573c34764e.slice/crio-conmon-6cee38755edc4145b698c1dd532389282c8aaf423b9ffc8f989f4cbf59368f73.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a601b3_1f16_4504_bc7c_aa573c34764e.slice/crio-conmon-fbbb0bf19b5d08e8e2c6509bed9f36d1976dd36486be4781f3b2b6bb8b726623.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:54:32 crc kubenswrapper[4913]: I1001 12:54:32.964084 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-969db9cf8-b2hmw" Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.057861 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b6c9764d-c6wjv"] Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.058329 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b6c9764d-c6wjv" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon" containerID="cri-o://59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37" gracePeriod=30 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.060198 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b6c9764d-c6wjv" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon-log" containerID="cri-o://eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23" gracePeriod=30 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.064580 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b6c9764d-c6wjv" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.140:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.327812 4913 generic.go:334] "Generic (PLEG): container finished" podID="f58bd451-a408-4ec8-908e-255afe71b949" containerID="a39b8415de589fc7e47a35723f6b40f66128e99e48801a4dee0b5e52069ed0b8" exitCode=137 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.328183 4913 generic.go:334] "Generic (PLEG): container 
finished" podID="f58bd451-a408-4ec8-908e-255afe71b949" containerID="b267d813692ffffb75e7b38084d91ce1ec41224bd100ec447c3e0afb2af3bdb7" exitCode=137 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.327975 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4466fb5f-6vzh5" event={"ID":"f58bd451-a408-4ec8-908e-255afe71b949","Type":"ContainerDied","Data":"a39b8415de589fc7e47a35723f6b40f66128e99e48801a4dee0b5e52069ed0b8"} Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.328259 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4466fb5f-6vzh5" event={"ID":"f58bd451-a408-4ec8-908e-255afe71b949","Type":"ContainerDied","Data":"b267d813692ffffb75e7b38084d91ce1ec41224bd100ec447c3e0afb2af3bdb7"} Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.330247 4913 generic.go:334] "Generic (PLEG): container finished" podID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerID="6cee38755edc4145b698c1dd532389282c8aaf423b9ffc8f989f4cbf59368f73" exitCode=137 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.330293 4913 generic.go:334] "Generic (PLEG): container finished" podID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerID="fbbb0bf19b5d08e8e2c6509bed9f36d1976dd36486be4781f3b2b6bb8b726623" exitCode=137 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.330305 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f465bcb97-7bzq4" event={"ID":"50a601b3-1f16-4504-bc7c-aa573c34764e","Type":"ContainerDied","Data":"6cee38755edc4145b698c1dd532389282c8aaf423b9ffc8f989f4cbf59368f73"} Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.330329 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f465bcb97-7bzq4" event={"ID":"50a601b3-1f16-4504-bc7c-aa573c34764e","Type":"ContainerDied","Data":"fbbb0bf19b5d08e8e2c6509bed9f36d1976dd36486be4781f3b2b6bb8b726623"} Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.332101 4913 generic.go:334] "Generic (PLEG): container finished" podID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerID="c63efd07dbfbb67af7f524c5157345f80bdc3e9222146bf65e9bc5fc64c117c7" exitCode=137 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.332120 4913 generic.go:334] "Generic (PLEG): container finished" podID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerID="af6f73bb43202e329df9438d340694a958048d36a7d533a6a92abb647e5e616f" exitCode=137 Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.332142 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df6769d97-bz6l5" event={"ID":"80c570d0-d665-4680-a6e5-b4c7734a87af","Type":"ContainerDied","Data":"c63efd07dbfbb67af7f524c5157345f80bdc3e9222146bf65e9bc5fc64c117c7"} Oct 01 12:54:33 crc kubenswrapper[4913]: I1001 12:54:33.332168 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df6769d97-bz6l5" event={"ID":"80c570d0-d665-4680-a6e5-b4c7734a87af","Type":"ContainerDied","Data":"af6f73bb43202e329df9438d340694a958048d36a7d533a6a92abb647e5e616f"} Oct 01 12:54:34 crc kubenswrapper[4913]: E1001 12:54:34.728771 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533" Oct 01 12:54:34 crc kubenswrapper[4913]: E1001 12:54:34.728930 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wfztb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-x9gmn_openstack(0f1ef1c4-7a72-4569-b21c-ef13cb766d25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:54:34 crc kubenswrapper[4913]: E1001 12:54:34.730573 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-x9gmn" podUID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" Oct 01 12:54:35 crc kubenswrapper[4913]: I1001 12:54:35.169713 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-szknr"] Oct 01 12:54:35 crc kubenswrapper[4913]: E1001 12:54:35.351105 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533\\\"\"" pod="openstack/barbican-db-sync-x9gmn" podUID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" Oct 01 12:54:35 crc kubenswrapper[4913]: E1001 12:54:35.951775 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695" Oct 01 12:54:35 crc kubenswrapper[4913]: E1001 12:54:35.951977 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swkz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qzddd_openstack(706c5fb0-a691-4f92-bb4e-a6ba720abfa1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:54:35 crc kubenswrapper[4913]: E1001 12:54:35.953149 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qzddd" podUID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" Oct 01 12:54:36 crc kubenswrapper[4913]: E1001 12:54:36.365359 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695\\\"\"" pod="openstack/cinder-db-sync-qzddd" podUID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.465541 4913 scope.go:117] "RemoveContainer" containerID="d29fabf35383083e92e42434b0257736d224722a8cf16a02142b5a468ce773f1" Oct 01 12:54:38 crc kubenswrapper[4913]: W1001 12:54:38.493779 4913 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b038340_cef3_419a_a1e2_2aa46a7f3ee6.slice/crio-7e2b3ddf726fa717a7f0ee62fd4b72b0915ad2ea932afbdac0243d4fa82c1b95 WatchSource:0}: Error finding container 7e2b3ddf726fa717a7f0ee62fd4b72b0915ad2ea932afbdac0243d4fa82c1b95: Status 404 returned error can't find the container with id 7e2b3ddf726fa717a7f0ee62fd4b72b0915ad2ea932afbdac0243d4fa82c1b95 Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.607527 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7df6769d97-bz6l5" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.618819 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.630589 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f465bcb97-7bzq4" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.788348 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-config-data\") pod \"50a601b3-1f16-4504-bc7c-aa573c34764e\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.788728 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngmtl\" (UniqueName: \"kubernetes.io/projected/50a601b3-1f16-4504-bc7c-aa573c34764e-kube-api-access-ngmtl\") pod \"50a601b3-1f16-4504-bc7c-aa573c34764e\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.788806 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-scripts\") pod \"50a601b3-1f16-4504-bc7c-aa573c34764e\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789592 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-config-data\") pod \"80c570d0-d665-4680-a6e5-b4c7734a87af\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789658 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx98l\" (UniqueName: \"kubernetes.io/projected/f58bd451-a408-4ec8-908e-255afe71b949-kube-api-access-hx98l\") pod \"f58bd451-a408-4ec8-908e-255afe71b949\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789684 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-scripts\") pod \"f58bd451-a408-4ec8-908e-255afe71b949\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789723 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c570d0-d665-4680-a6e5-b4c7734a87af-logs\") pod \"80c570d0-d665-4680-a6e5-b4c7734a87af\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789783 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-8p6f9\" (UniqueName: \"kubernetes.io/projected/80c570d0-d665-4680-a6e5-b4c7734a87af-kube-api-access-8p6f9\") pod \"80c570d0-d665-4680-a6e5-b4c7734a87af\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789812 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/50a601b3-1f16-4504-bc7c-aa573c34764e-horizon-secret-key\") pod \"50a601b3-1f16-4504-bc7c-aa573c34764e\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789848 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-config-data\") pod \"f58bd451-a408-4ec8-908e-255afe71b949\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789868 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80c570d0-d665-4680-a6e5-b4c7734a87af-horizon-secret-key\") pod \"80c570d0-d665-4680-a6e5-b4c7734a87af\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789899 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a601b3-1f16-4504-bc7c-aa573c34764e-logs\") pod \"50a601b3-1f16-4504-bc7c-aa573c34764e\" (UID: \"50a601b3-1f16-4504-bc7c-aa573c34764e\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.789933 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-scripts\") pod \"80c570d0-d665-4680-a6e5-b4c7734a87af\" (UID: \"80c570d0-d665-4680-a6e5-b4c7734a87af\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.790006 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f58bd451-a408-4ec8-908e-255afe71b949-horizon-secret-key\") pod \"f58bd451-a408-4ec8-908e-255afe71b949\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.790031 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58bd451-a408-4ec8-908e-255afe71b949-logs\") pod \"f58bd451-a408-4ec8-908e-255afe71b949\" (UID: \"f58bd451-a408-4ec8-908e-255afe71b949\") " Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.791291 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a601b3-1f16-4504-bc7c-aa573c34764e-logs" (OuterVolumeSpecName: "logs") pod "50a601b3-1f16-4504-bc7c-aa573c34764e" (UID: "50a601b3-1f16-4504-bc7c-aa573c34764e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.791694 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58bd451-a408-4ec8-908e-255afe71b949-logs" (OuterVolumeSpecName: "logs") pod "f58bd451-a408-4ec8-908e-255afe71b949" (UID: "f58bd451-a408-4ec8-908e-255afe71b949"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.792979 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80c570d0-d665-4680-a6e5-b4c7734a87af-logs" (OuterVolumeSpecName: "logs") pod "80c570d0-d665-4680-a6e5-b4c7734a87af" (UID: "80c570d0-d665-4680-a6e5-b4c7734a87af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.795290 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a601b3-1f16-4504-bc7c-aa573c34764e-kube-api-access-ngmtl" (OuterVolumeSpecName: "kube-api-access-ngmtl") pod "50a601b3-1f16-4504-bc7c-aa573c34764e" (UID: "50a601b3-1f16-4504-bc7c-aa573c34764e"). InnerVolumeSpecName "kube-api-access-ngmtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.796089 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a601b3-1f16-4504-bc7c-aa573c34764e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "50a601b3-1f16-4504-bc7c-aa573c34764e" (UID: "50a601b3-1f16-4504-bc7c-aa573c34764e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.796388 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58bd451-a408-4ec8-908e-255afe71b949-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f58bd451-a408-4ec8-908e-255afe71b949" (UID: "f58bd451-a408-4ec8-908e-255afe71b949"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.802545 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c570d0-d665-4680-a6e5-b4c7734a87af-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "80c570d0-d665-4680-a6e5-b4c7734a87af" (UID: "80c570d0-d665-4680-a6e5-b4c7734a87af"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.805777 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c570d0-d665-4680-a6e5-b4c7734a87af-kube-api-access-8p6f9" (OuterVolumeSpecName: "kube-api-access-8p6f9") pod "80c570d0-d665-4680-a6e5-b4c7734a87af" (UID: "80c570d0-d665-4680-a6e5-b4c7734a87af"). InnerVolumeSpecName "kube-api-access-8p6f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.822381 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-config-data" (OuterVolumeSpecName: "config-data") pod "80c570d0-d665-4680-a6e5-b4c7734a87af" (UID: "80c570d0-d665-4680-a6e5-b4c7734a87af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.823438 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58bd451-a408-4ec8-908e-255afe71b949-kube-api-access-hx98l" (OuterVolumeSpecName: "kube-api-access-hx98l") pod "f58bd451-a408-4ec8-908e-255afe71b949" (UID: "f58bd451-a408-4ec8-908e-255afe71b949"). 
InnerVolumeSpecName "kube-api-access-hx98l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.823673 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-scripts" (OuterVolumeSpecName: "scripts") pod "80c570d0-d665-4680-a6e5-b4c7734a87af" (UID: "80c570d0-d665-4680-a6e5-b4c7734a87af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.829876 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-config-data" (OuterVolumeSpecName: "config-data") pod "f58bd451-a408-4ec8-908e-255afe71b949" (UID: "f58bd451-a408-4ec8-908e-255afe71b949"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.831326 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-config-data" (OuterVolumeSpecName: "config-data") pod "50a601b3-1f16-4504-bc7c-aa573c34764e" (UID: "50a601b3-1f16-4504-bc7c-aa573c34764e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.849108 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-scripts" (OuterVolumeSpecName: "scripts") pod "50a601b3-1f16-4504-bc7c-aa573c34764e" (UID: "50a601b3-1f16-4504-bc7c-aa573c34764e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.870111 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-scripts" (OuterVolumeSpecName: "scripts") pod "f58bd451-a408-4ec8-908e-255afe71b949" (UID: "f58bd451-a408-4ec8-908e-255afe71b949"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892227 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngmtl\" (UniqueName: \"kubernetes.io/projected/50a601b3-1f16-4504-bc7c-aa573c34764e-kube-api-access-ngmtl\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892280 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892291 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892300 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx98l\" (UniqueName: \"kubernetes.io/projected/f58bd451-a408-4ec8-908e-255afe71b949-kube-api-access-hx98l\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892309 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892318 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80c570d0-d665-4680-a6e5-b4c7734a87af-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892326 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p6f9\" (UniqueName: \"kubernetes.io/projected/80c570d0-d665-4680-a6e5-b4c7734a87af-kube-api-access-8p6f9\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892335 4913 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/50a601b3-1f16-4504-bc7c-aa573c34764e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892343 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f58bd451-a408-4ec8-908e-255afe71b949-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892350 4913 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/80c570d0-d665-4680-a6e5-b4c7734a87af-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892358 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50a601b3-1f16-4504-bc7c-aa573c34764e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892366 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80c570d0-d665-4680-a6e5-b4c7734a87af-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892373 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f58bd451-a408-4ec8-908e-255afe71b949-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892381 4913 
reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f58bd451-a408-4ec8-908e-255afe71b949-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.892389 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50a601b3-1f16-4504-bc7c-aa573c34764e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4913]: I1001 12:54:38.963670 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d8f89cd7f-sqjg8"] Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.390614 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df6769d97-bz6l5" event={"ID":"80c570d0-d665-4680-a6e5-b4c7734a87af","Type":"ContainerDied","Data":"86e4321f2011037ff95bf4661a73063795966fcb84490fac33e04314daf4e951"} Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.390629 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7df6769d97-bz6l5" Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.397358 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4466fb5f-6vzh5" event={"ID":"f58bd451-a408-4ec8-908e-255afe71b949","Type":"ContainerDied","Data":"c63a8e9db01eda3dd5b0994373a5712ebf7fb1e666144bba233b6d20e5c136cf"} Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.397440 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c4466fb5f-6vzh5" Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.409639 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7df6769d97-bz6l5"] Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.412998 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-szknr" event={"ID":"3b038340-cef3-419a-a1e2-2aa46a7f3ee6","Type":"ContainerStarted","Data":"7e2b3ddf726fa717a7f0ee62fd4b72b0915ad2ea932afbdac0243d4fa82c1b95"} Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.421379 4913 generic.go:334] "Generic (PLEG): container finished" podID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerID="59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37" exitCode=0 Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.421451 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6c9764d-c6wjv" event={"ID":"3c044c3c-7de5-45ec-b450-cbfaf0ca415f","Type":"ContainerDied","Data":"59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37"} Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.423987 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7df6769d97-bz6l5"] Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.426097 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f465bcb97-7bzq4" event={"ID":"50a601b3-1f16-4504-bc7c-aa573c34764e","Type":"ContainerDied","Data":"9d3a8d277c00bec18bb47349d8fdb2fcd336a437998d17676844375df249d9d6"} Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.426175 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f465bcb97-7bzq4" Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.432057 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c4466fb5f-6vzh5"] Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.438833 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c4466fb5f-6vzh5"] Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.447221 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f465bcb97-7bzq4"] Oct 01 12:54:39 crc kubenswrapper[4913]: I1001 12:54:39.453841 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f465bcb97-7bzq4"] Oct 01 12:54:40 crc kubenswrapper[4913]: I1001 12:54:40.823768 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" path="/var/lib/kubelet/pods/50a601b3-1f16-4504-bc7c-aa573c34764e/volumes" Oct 01 12:54:40 crc kubenswrapper[4913]: I1001 12:54:40.825083 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" path="/var/lib/kubelet/pods/80c570d0-d665-4680-a6e5-b4c7734a87af/volumes" Oct 01 12:54:40 crc kubenswrapper[4913]: I1001 12:54:40.825979 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58bd451-a408-4ec8-908e-255afe71b949" path="/var/lib/kubelet/pods/f58bd451-a408-4ec8-908e-255afe71b949/volumes" Oct 01 12:54:41 crc kubenswrapper[4913]: W1001 12:54:41.628479 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29fe08d_d79a_48ba_b8b8_67eda446e3c6.slice/crio-201a86978786cee44645669010f0c5baa2742f7f0284a69477e938a433458ce7 WatchSource:0}: Error finding container 201a86978786cee44645669010f0c5baa2742f7f0284a69477e938a433458ce7: Status 404 returned error can't find the container with id 201a86978786cee44645669010f0c5baa2742f7f0284a69477e938a433458ce7 Oct 01 12:54:41 crc kubenswrapper[4913]: I1001 12:54:41.640217 4913 scope.go:117] "RemoveContainer" containerID="c63efd07dbfbb67af7f524c5157345f80bdc3e9222146bf65e9bc5fc64c117c7" Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.157907 4913 scope.go:117] "RemoveContainer" containerID="af6f73bb43202e329df9438d340694a958048d36a7d533a6a92abb647e5e616f" Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.199827 4913 scope.go:117] "RemoveContainer" containerID="a39b8415de589fc7e47a35723f6b40f66128e99e48801a4dee0b5e52069ed0b8" Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.437513 4913 scope.go:117] "RemoveContainer" containerID="b267d813692ffffb75e7b38084d91ce1ec41224bd100ec447c3e0afb2af3bdb7" Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.469213 4913 scope.go:117] "RemoveContainer" containerID="6cee38755edc4145b698c1dd532389282c8aaf423b9ffc8f989f4cbf59368f73" Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.475066 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-szknr" event={"ID":"3b038340-cef3-419a-a1e2-2aa46a7f3ee6","Type":"ContainerStarted","Data":"c868668574f49e75abdcb7f6f977a2e4d3d3d10bfbc90936e59801d010e6d903"} Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.478286 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66d867dfb6-r9zrq" event={"ID":"11336389-1acf-4342-b478-e11f04e7848d","Type":"ContainerStarted","Data":"5e9982279ce139a7f43b5878fe7d2ac34a910398d09ea111afc3595a6f70b94d"} Oct 01 12:54:42 crc kubenswrapper[4913]: 
I1001 12:54:42.481841 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d8f89cd7f-sqjg8" event={"ID":"a29fe08d-d79a-48ba-b8b8-67eda446e3c6","Type":"ContainerStarted","Data":"201a86978786cee44645669010f0c5baa2742f7f0284a69477e938a433458ce7"} Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.501352 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-szknr" podStartSLOduration=34.501330038 podStartE2EDuration="34.501330038s" podCreationTimestamp="2025-10-01 12:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:42.491007026 +0000 UTC m=+1014.394482644" watchObservedRunningTime="2025-10-01 12:54:42.501330038 +0000 UTC m=+1014.404805626" Oct 01 12:54:42 crc kubenswrapper[4913]: I1001 12:54:42.649907 4913 scope.go:117] "RemoveContainer" containerID="fbbb0bf19b5d08e8e2c6509bed9f36d1976dd36486be4781f3b2b6bb8b726623" Oct 01 12:54:42 crc kubenswrapper[4913]: E1001 12:54:42.902757 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.496904 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e43a3fef-6dd6-4239-b6de-028dfa7145fc","Type":"ContainerStarted","Data":"3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f"} Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.496995 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="ceilometer-notification-agent" containerID="cri-o://1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a" gracePeriod=30 Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.497088 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.497225 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="proxy-httpd" containerID="cri-o://3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f" gracePeriod=30 Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.504540 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b6l67" event={"ID":"57296118-560c-4764-b94a-472d8467f7c0","Type":"ContainerStarted","Data":"6e2d93f6b1e95834d602362606c94652fbad38050142bcb478336afa63092c4e"} Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.507411 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66d867dfb6-r9zrq" event={"ID":"11336389-1acf-4342-b478-e11f04e7848d","Type":"ContainerStarted","Data":"a9802a3f65a62d1181785834daa8ab906ab353aecba2eca1dd86d32e8ccf8783"} Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.507577 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.511878 4913 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d8f89cd7f-sqjg8" event={"ID":"a29fe08d-d79a-48ba-b8b8-67eda446e3c6","Type":"ContainerStarted","Data":"1597933bb45776a3e79a6599fa6db6ea1cc883003627f0fded4e4dfabe63a008"} Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.601876 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b6l67" podStartSLOduration=13.009011316 podStartE2EDuration="1m12.601859225s" podCreationTimestamp="2025-10-01 12:53:31 +0000 UTC" firstStartedPulling="2025-10-01 12:53:35.141762382 +0000 UTC m=+947.045237960" lastFinishedPulling="2025-10-01 12:54:34.734610291 +0000 UTC m=+1006.638085869" observedRunningTime="2025-10-01 12:54:43.554771185 +0000 UTC m=+1015.458246763" watchObservedRunningTime="2025-10-01 12:54:43.601859225 +0000 UTC m=+1015.505334803" Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.603378 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d8f89cd7f-sqjg8" podStartSLOduration=24.603370727 podStartE2EDuration="24.603370727s" podCreationTimestamp="2025-10-01 12:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:43.589261841 +0000 UTC m=+1015.492737459" watchObservedRunningTime="2025-10-01 12:54:43.603370727 +0000 UTC m=+1015.506846295" Oct 01 12:54:43 crc kubenswrapper[4913]: I1001 12:54:43.623368 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-66d867dfb6-r9zrq" podStartSLOduration=37.623349284 podStartE2EDuration="37.623349284s" podCreationTimestamp="2025-10-01 12:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:43.620881127 +0000 UTC m=+1015.524356745" watchObservedRunningTime="2025-10-01 12:54:43.623349284 +0000 UTC m=+1015.526824862" Oct 01 12:54:44 crc kubenswrapper[4913]: I1001 12:54:44.537070 4913 generic.go:334] "Generic (PLEG): container finished" podID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerID="3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f" exitCode=0 Oct 01 12:54:44 crc kubenswrapper[4913]: I1001 12:54:44.537611 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e43a3fef-6dd6-4239-b6de-028dfa7145fc","Type":"ContainerDied","Data":"3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f"} Oct 01 12:54:44 crc kubenswrapper[4913]: I1001 12:54:44.537882 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:54:44 crc kubenswrapper[4913]: I1001 12:54:44.537930 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.148392 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235235 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-run-httpd\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235323 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-combined-ca-bundle\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235405 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-log-httpd\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235501 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6gmq\" (UniqueName: \"kubernetes.io/projected/e43a3fef-6dd6-4239-b6de-028dfa7145fc-kube-api-access-q6gmq\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235586 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-sg-core-conf-yaml\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235636 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-scripts\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235674 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-config-data\") pod \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\" (UID: \"e43a3fef-6dd6-4239-b6de-028dfa7145fc\") " Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.235908 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.236090 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.237777 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.242207 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.243050 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-scripts" (OuterVolumeSpecName: "scripts") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.244673 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43a3fef-6dd6-4239-b6de-028dfa7145fc-kube-api-access-q6gmq" (OuterVolumeSpecName: "kube-api-access-q6gmq") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). InnerVolumeSpecName "kube-api-access-q6gmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.295718 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.337416 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.337447 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.337459 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.337469 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e43a3fef-6dd6-4239-b6de-028dfa7145fc-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.337481 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6gmq\" (UniqueName: \"kubernetes.io/projected/e43a3fef-6dd6-4239-b6de-028dfa7145fc-kube-api-access-q6gmq\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.348458 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-config-data" (OuterVolumeSpecName: "config-data") pod "e43a3fef-6dd6-4239-b6de-028dfa7145fc" (UID: "e43a3fef-6dd6-4239-b6de-028dfa7145fc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.439443 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e43a3fef-6dd6-4239-b6de-028dfa7145fc-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.565871 4913 generic.go:334] "Generic (PLEG): container finished" podID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerID="1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a" exitCode=0 Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.565929 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e43a3fef-6dd6-4239-b6de-028dfa7145fc","Type":"ContainerDied","Data":"1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a"} Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.565968 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e43a3fef-6dd6-4239-b6de-028dfa7145fc","Type":"ContainerDied","Data":"27f4c22dfc763acad225aad9466c734126458adcf4287ab84e44ff70d73ab501"} Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.565993 4913 scope.go:117] "RemoveContainer" containerID="3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.566402 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.595113 4913 scope.go:117] "RemoveContainer" containerID="1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.624730 4913 scope.go:117] "RemoveContainer" containerID="3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.625634 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f\": container with ID starting with 3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f not found: ID does not exist" containerID="3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.625669 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f"} err="failed to get container status \"3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f\": rpc error: code = NotFound desc = could not find container \"3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f\": container with ID starting with 3f3b102a1834a3a2ce94bb62e9464ac0ced13f626ccb1711185f90f4907a3a0f not found: ID does not exist" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.625694 4913 scope.go:117] "RemoveContainer" containerID="1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.626045 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a\": container with ID starting with 1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a not found: ID does not exist" 
containerID="1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.626072 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a"} err="failed to get container status \"1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a\": rpc error: code = NotFound desc = could not find container \"1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a\": container with ID starting with 1ffea247c10bf8f562015e6f656838b869ab34d8fb3f784d4dd1674ae546a61a not found: ID does not exist" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.633299 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.649190 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.670470 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672641 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="proxy-httpd" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672676 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="proxy-httpd" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672701 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672709 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672729 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672738 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672750 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672759 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672791 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672800 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672830 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="ceilometer-notification-agent" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672839 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="ceilometer-notification-agent" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.672868 4913 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="init" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.672877 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="init" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.673790 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.673822 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.673855 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.673864 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: E1001 12:54:47.673885 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.673893 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674474 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674521 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674701 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f29a2a3-85a3-40a0-ab42-1e575dea129c" containerName="dnsmasq-dns" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674738 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="ceilometer-notification-agent" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674765 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon-log" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674791 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58bd451-a408-4ec8-908e-255afe71b949" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674819 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" containerName="proxy-httpd" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674830 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c570d0-d665-4680-a6e5-b4c7734a87af" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.674852 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a601b3-1f16-4504-bc7c-aa573c34764e" containerName="horizon" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.680041 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.682399 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.682464 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.692247 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.845821 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-config-data\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.845896 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-run-httpd\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.845927 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72g8x\" (UniqueName: \"kubernetes.io/projected/51ed98c9-9585-44d2-a913-ebdcfa04ac53-kube-api-access-72g8x\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.845944 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-log-httpd\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.845958 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.845977 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.846002 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-scripts\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.947912 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-config-data\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.948000 
4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-run-httpd\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.948068 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72g8x\" (UniqueName: \"kubernetes.io/projected/51ed98c9-9585-44d2-a913-ebdcfa04ac53-kube-api-access-72g8x\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.948100 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-log-httpd\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.948124 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.948185 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.948206 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-scripts\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.949707 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-run-httpd\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.950756 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-log-httpd\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.955377 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.955459 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-scripts\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.955778 4913 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-config-data\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.955886 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:47 crc kubenswrapper[4913]: I1001 12:54:47.965950 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72g8x\" (UniqueName: \"kubernetes.io/projected/51ed98c9-9585-44d2-a913-ebdcfa04ac53-kube-api-access-72g8x\") pod \"ceilometer-0\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") " pod="openstack/ceilometer-0" Oct 01 12:54:48 crc kubenswrapper[4913]: I1001 12:54:48.004641 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:48 crc kubenswrapper[4913]: I1001 12:54:48.472168 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:48 crc kubenswrapper[4913]: W1001 12:54:48.477851 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ed98c9_9585_44d2_a913_ebdcfa04ac53.slice/crio-05d7293ca7da7b6b95bc2a89a9735bd79134c08b7644776ae078775c97025fc5 WatchSource:0}: Error finding container 05d7293ca7da7b6b95bc2a89a9735bd79134c08b7644776ae078775c97025fc5: Status 404 returned error can't find the container with id 05d7293ca7da7b6b95bc2a89a9735bd79134c08b7644776ae078775c97025fc5 Oct 01 12:54:48 crc kubenswrapper[4913]: I1001 12:54:48.575197 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerStarted","Data":"05d7293ca7da7b6b95bc2a89a9735bd79134c08b7644776ae078775c97025fc5"} Oct 01 12:54:48 crc kubenswrapper[4913]: I1001 12:54:48.816298 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43a3fef-6dd6-4239-b6de-028dfa7145fc" path="/var/lib/kubelet/pods/e43a3fef-6dd6-4239-b6de-028dfa7145fc/volumes" Oct 01 12:54:49 crc kubenswrapper[4913]: I1001 12:54:49.591673 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzddd" event={"ID":"706c5fb0-a691-4f92-bb4e-a6ba720abfa1","Type":"ContainerStarted","Data":"56600f195a007cdc232b8005819f8754c4b696d5bdab718fbac42ed771d8ffe2"} Oct 01 12:54:49 crc kubenswrapper[4913]: I1001 12:54:49.595552 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerStarted","Data":"f32d5070ff8f0ead1639f29d9e08814d62604cfbb04d189becaa4cf950263da1"} Oct 01 12:54:49 crc kubenswrapper[4913]: I1001 12:54:49.613606 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qzddd" podStartSLOduration=20.739816978 podStartE2EDuration="1m6.613585397s" podCreationTimestamp="2025-10-01 12:53:43 +0000 UTC" firstStartedPulling="2025-10-01 12:54:02.443260081 +0000 UTC m=+974.346735659" lastFinishedPulling="2025-10-01 12:54:48.3170285 +0000 UTC m=+1020.220504078" observedRunningTime="2025-10-01 12:54:49.606823562 +0000 UTC m=+1021.510299170" watchObservedRunningTime="2025-10-01 12:54:49.613585397 +0000 UTC 
m=+1021.517060995" Oct 01 12:54:50 crc kubenswrapper[4913]: I1001 12:54:50.603022 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gmn" event={"ID":"0f1ef1c4-7a72-4569-b21c-ef13cb766d25","Type":"ContainerStarted","Data":"c744d3854adb67bd38db6954675af051c3c83181adcd76ae4cf902629db30151"} Oct 01 12:54:50 crc kubenswrapper[4913]: I1001 12:54:50.625284 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-x9gmn" podStartSLOduration=20.146008703 podStartE2EDuration="1m7.62525495s" podCreationTimestamp="2025-10-01 12:53:43 +0000 UTC" firstStartedPulling="2025-10-01 12:54:02.440203438 +0000 UTC m=+974.343679016" lastFinishedPulling="2025-10-01 12:54:49.919449685 +0000 UTC m=+1021.822925263" observedRunningTime="2025-10-01 12:54:50.620906181 +0000 UTC m=+1022.524381769" watchObservedRunningTime="2025-10-01 12:54:50.62525495 +0000 UTC m=+1022.528730528" Oct 01 12:54:51 crc kubenswrapper[4913]: I1001 12:54:51.249501 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d8f89cd7f-sqjg8" Oct 01 12:54:51 crc kubenswrapper[4913]: I1001 12:54:51.612906 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerStarted","Data":"5dc12d816508368eb5a3aadbf766c998c8622ed86236bdb278cd12573e773dcc"} Oct 01 12:54:51 crc kubenswrapper[4913]: I1001 12:54:51.613160 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerStarted","Data":"8e1504779d7618297122a847a998455f5875d6bf883d208ceecb539655235cef"} Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.562931 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.565210 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.568248 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.568288 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.576105 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7f9mx" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.583025 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.637440 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg6bw\" (UniqueName: \"kubernetes.io/projected/fe4614c3-9118-41ab-be00-667f0bbca6bb-kube-api-access-qg6bw\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.637551 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fe4614c3-9118-41ab-be00-667f0bbca6bb-openstack-config\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.637591 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fe4614c3-9118-41ab-be00-667f0bbca6bb-openstack-config-secret\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.637784 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4614c3-9118-41ab-be00-667f0bbca6bb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.739879 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fe4614c3-9118-41ab-be00-667f0bbca6bb-openstack-config\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.739961 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fe4614c3-9118-41ab-be00-667f0bbca6bb-openstack-config-secret\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.740019 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4614c3-9118-41ab-be00-667f0bbca6bb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.740313 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qg6bw\" (UniqueName: \"kubernetes.io/projected/fe4614c3-9118-41ab-be00-667f0bbca6bb-kube-api-access-qg6bw\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.741476 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fe4614c3-9118-41ab-be00-667f0bbca6bb-openstack-config\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.748471 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fe4614c3-9118-41ab-be00-667f0bbca6bb-openstack-config-secret\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.764086 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4614c3-9118-41ab-be00-667f0bbca6bb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.766608 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg6bw\" (UniqueName: \"kubernetes.io/projected/fe4614c3-9118-41ab-be00-667f0bbca6bb-kube-api-access-qg6bw\") pod \"openstackclient\" (UID: \"fe4614c3-9118-41ab-be00-667f0bbca6bb\") " pod="openstack/openstackclient" Oct 01 12:54:52 crc kubenswrapper[4913]: I1001 12:54:52.886074 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 12:54:53 crc kubenswrapper[4913]: I1001 12:54:53.364502 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 12:54:53 crc kubenswrapper[4913]: I1001 12:54:53.630817 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerStarted","Data":"6bae22fb080983f57bd130e4b1d7119ad7602dcbff44013e501552b5bcdc8f70"} Oct 01 12:54:53 crc kubenswrapper[4913]: I1001 12:54:53.631154 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:53 crc kubenswrapper[4913]: I1001 12:54:53.631731 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fe4614c3-9118-41ab-be00-667f0bbca6bb","Type":"ContainerStarted","Data":"ba43403576cc6567a1be857e21c88eb0dc44f61f1225ab2f2b4ec5728131f7d3"} Oct 01 12:54:53 crc kubenswrapper[4913]: I1001 12:54:53.662432 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.282365694 podStartE2EDuration="6.662416738s" podCreationTimestamp="2025-10-01 12:54:47 +0000 UTC" firstStartedPulling="2025-10-01 12:54:48.480916169 +0000 UTC m=+1020.384391747" lastFinishedPulling="2025-10-01 12:54:52.860967213 +0000 UTC m=+1024.764442791" observedRunningTime="2025-10-01 12:54:53.66064199 +0000 UTC m=+1025.564117598" watchObservedRunningTime="2025-10-01 12:54:53.662416738 +0000 UTC m=+1025.565892316" Oct 01 12:54:56 crc kubenswrapper[4913]: I1001 12:54:56.676033 4913 generic.go:334] "Generic (PLEG): container finished" podID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" containerID="c744d3854adb67bd38db6954675af051c3c83181adcd76ae4cf902629db30151" exitCode=0 Oct 01 12:54:56 crc kubenswrapper[4913]: I1001 12:54:56.676113 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gmn" event={"ID":"0f1ef1c4-7a72-4569-b21c-ef13cb766d25","Type":"ContainerDied","Data":"c744d3854adb67bd38db6954675af051c3c83181adcd76ae4cf902629db30151"} Oct 01 12:54:58 crc kubenswrapper[4913]: I1001 12:54:58.696612 4913 generic.go:334] "Generic (PLEG): container finished" podID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" containerID="56600f195a007cdc232b8005819f8754c4b696d5bdab718fbac42ed771d8ffe2" exitCode=0 Oct 01 12:54:58 crc kubenswrapper[4913]: I1001 12:54:58.696724 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzddd" event={"ID":"706c5fb0-a691-4f92-bb4e-a6ba720abfa1","Type":"ContainerDied","Data":"56600f195a007cdc232b8005819f8754c4b696d5bdab718fbac42ed771d8ffe2"} Oct 01 12:55:00 crc kubenswrapper[4913]: I1001 12:55:00.889803 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hkhhg"] Oct 01 12:55:00 crc kubenswrapper[4913]: I1001 12:55:00.891314 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:00 crc kubenswrapper[4913]: I1001 12:55:00.900064 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hkhhg"] Oct 01 12:55:00 crc kubenswrapper[4913]: I1001 12:55:00.996482 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25fb\" (UniqueName: \"kubernetes.io/projected/4983dc8d-2950-45be-9bd3-33f5e24d52ef-kube-api-access-s25fb\") pod \"nova-api-db-create-hkhhg\" (UID: \"4983dc8d-2950-45be-9bd3-33f5e24d52ef\") " pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.097659 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25fb\" (UniqueName: \"kubernetes.io/projected/4983dc8d-2950-45be-9bd3-33f5e24d52ef-kube-api-access-s25fb\") pod \"nova-api-db-create-hkhhg\" (UID: \"4983dc8d-2950-45be-9bd3-33f5e24d52ef\") " pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.098629 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-g8j5s"] Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.099675 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.114380 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-g8j5s"] Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.131047 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25fb\" (UniqueName: \"kubernetes.io/projected/4983dc8d-2950-45be-9bd3-33f5e24d52ef-kube-api-access-s25fb\") pod \"nova-api-db-create-hkhhg\" (UID: \"4983dc8d-2950-45be-9bd3-33f5e24d52ef\") " pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.187631 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bf4s9"] Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.189284 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.198807 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k949\" (UniqueName: \"kubernetes.io/projected/3fa06e3e-e04d-481f-87e0-a55d168994f7-kube-api-access-2k949\") pod \"nova-cell0-db-create-g8j5s\" (UID: \"3fa06e3e-e04d-481f-87e0-a55d168994f7\") " pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.211031 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bf4s9"] Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.213792 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.301119 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mrg\" (UniqueName: \"kubernetes.io/projected/82163264-a7e8-4183-b162-8ddabbce7f39-kube-api-access-42mrg\") pod \"nova-cell1-db-create-bf4s9\" (UID: \"82163264-a7e8-4183-b162-8ddabbce7f39\") " pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.301316 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k949\" (UniqueName: \"kubernetes.io/projected/3fa06e3e-e04d-481f-87e0-a55d168994f7-kube-api-access-2k949\") pod \"nova-cell0-db-create-g8j5s\" (UID: \"3fa06e3e-e04d-481f-87e0-a55d168994f7\") " pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.330909 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k949\" (UniqueName: \"kubernetes.io/projected/3fa06e3e-e04d-481f-87e0-a55d168994f7-kube-api-access-2k949\") pod \"nova-cell0-db-create-g8j5s\" (UID: \"3fa06e3e-e04d-481f-87e0-a55d168994f7\") " pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.402693 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42mrg\" (UniqueName: \"kubernetes.io/projected/82163264-a7e8-4183-b162-8ddabbce7f39-kube-api-access-42mrg\") pod \"nova-cell1-db-create-bf4s9\" (UID: \"82163264-a7e8-4183-b162-8ddabbce7f39\") " pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.417537 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.421235 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mrg\" (UniqueName: \"kubernetes.io/projected/82163264-a7e8-4183-b162-8ddabbce7f39-kube-api-access-42mrg\") pod \"nova-cell1-db-create-bf4s9\" (UID: \"82163264-a7e8-4183-b162-8ddabbce7f39\") " pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:01 crc kubenswrapper[4913]: I1001 12:55:01.510853 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.730203 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x9gmn" event={"ID":"0f1ef1c4-7a72-4569-b21c-ef13cb766d25","Type":"ContainerDied","Data":"125d90f0e82b07a0a7bb4c0b5c741f8357fe680f7b8269fe94f38423f4ce2229"} Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.730911 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125d90f0e82b07a0a7bb4c0b5c741f8357fe680f7b8269fe94f38423f4ce2229" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.732043 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzddd" event={"ID":"706c5fb0-a691-4f92-bb4e-a6ba720abfa1","Type":"ContainerDied","Data":"272efc5b1b50c78ce94135ef75305fa56f423025c9cba7737c31de53cb023d8d"} Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.732075 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272efc5b1b50c78ce94135ef75305fa56f423025c9cba7737c31de53cb023d8d" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.747714 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qzddd" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.752884 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828022 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-db-sync-config-data\") pod \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828063 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swkz5\" (UniqueName: \"kubernetes.io/projected/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-kube-api-access-swkz5\") pod \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828120 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-scripts\") pod \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828144 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-combined-ca-bundle\") pod \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828195 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-combined-ca-bundle\") pod \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828237 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-kube-api-access-wfztb\") pod 
\"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828256 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-etc-machine-id\") pod \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828333 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-config-data\") pod \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\" (UID: \"706c5fb0-a691-4f92-bb4e-a6ba720abfa1\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.828357 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-db-sync-config-data\") pod \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\" (UID: \"0f1ef1c4-7a72-4569-b21c-ef13cb766d25\") " Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.829107 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "706c5fb0-a691-4f92-bb4e-a6ba720abfa1" (UID: "706c5fb0-a691-4f92-bb4e-a6ba720abfa1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.834737 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-kube-api-access-wfztb" (OuterVolumeSpecName: "kube-api-access-wfztb") pod "0f1ef1c4-7a72-4569-b21c-ef13cb766d25" (UID: "0f1ef1c4-7a72-4569-b21c-ef13cb766d25"). InnerVolumeSpecName "kube-api-access-wfztb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.835092 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "706c5fb0-a691-4f92-bb4e-a6ba720abfa1" (UID: "706c5fb0-a691-4f92-bb4e-a6ba720abfa1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.835202 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-kube-api-access-swkz5" (OuterVolumeSpecName: "kube-api-access-swkz5") pod "706c5fb0-a691-4f92-bb4e-a6ba720abfa1" (UID: "706c5fb0-a691-4f92-bb4e-a6ba720abfa1"). InnerVolumeSpecName "kube-api-access-swkz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.835590 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0f1ef1c4-7a72-4569-b21c-ef13cb766d25" (UID: "0f1ef1c4-7a72-4569-b21c-ef13cb766d25"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.836998 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-scripts" (OuterVolumeSpecName: "scripts") pod "706c5fb0-a691-4f92-bb4e-a6ba720abfa1" (UID: "706c5fb0-a691-4f92-bb4e-a6ba720abfa1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.859797 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f1ef1c4-7a72-4569-b21c-ef13cb766d25" (UID: "0f1ef1c4-7a72-4569-b21c-ef13cb766d25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.862481 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "706c5fb0-a691-4f92-bb4e-a6ba720abfa1" (UID: "706c5fb0-a691-4f92-bb4e-a6ba720abfa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.876720 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-config-data" (OuterVolumeSpecName: "config-data") pod "706c5fb0-a691-4f92-bb4e-a6ba720abfa1" (UID: "706c5fb0-a691-4f92-bb4e-a6ba720abfa1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930254 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930297 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930308 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930316 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-kube-api-access-wfztb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930327 4913 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930335 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930343 4913 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/0f1ef1c4-7a72-4569-b21c-ef13cb766d25-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930351 4913 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.930359 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swkz5\" (UniqueName: \"kubernetes.io/projected/706c5fb0-a691-4f92-bb4e-a6ba720abfa1-kube-api-access-swkz5\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:02 crc kubenswrapper[4913]: I1001 12:55:02.932527 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-g8j5s"] Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.006743 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bf4s9"] Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.084673 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hkhhg"] Oct 01 12:55:03 crc kubenswrapper[4913]: W1001 12:55:03.123118 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4983dc8d_2950_45be_9bd3_33f5e24d52ef.slice/crio-53adec0d08cc75b0a93432ae8cb829b9b927338e3379aac83e7dd8a89e2e2f0b WatchSource:0}: Error finding container 53adec0d08cc75b0a93432ae8cb829b9b927338e3379aac83e7dd8a89e2e2f0b: Status 404 returned error can't find the container with id 53adec0d08cc75b0a93432ae8cb829b9b927338e3379aac83e7dd8a89e2e2f0b Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.330380 4913 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fa06e3e_e04d_481f_87e0_a55d168994f7.slice/crio-ff54338cd5d7745e0ab1b98057764d1aec54032102b8ec54ceb2241b8c876f42.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.371280 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.537910 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-logs\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.537962 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-config-data\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.537999 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-scripts\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.538030 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-tls-certs\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.538095 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-combined-ca-bundle\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.538132 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-secret-key\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.538192 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9b9j\" (UniqueName: \"kubernetes.io/projected/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-kube-api-access-v9b9j\") pod \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\" (UID: \"3c044c3c-7de5-45ec-b450-cbfaf0ca415f\") " Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.538346 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-logs" (OuterVolumeSpecName: "logs") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.538966 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.543837 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.545392 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-kube-api-access-v9b9j" (OuterVolumeSpecName: "kube-api-access-v9b9j") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). InnerVolumeSpecName "kube-api-access-v9b9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.561528 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-config-data" (OuterVolumeSpecName: "config-data") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.567755 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-scripts" (OuterVolumeSpecName: "scripts") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.575645 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.612063 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "3c044c3c-7de5-45ec-b450-cbfaf0ca415f" (UID: "3c044c3c-7de5-45ec-b450-cbfaf0ca415f"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.672808 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9b9j\" (UniqueName: \"kubernetes.io/projected/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-kube-api-access-v9b9j\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.672920 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.672990 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.673008 4913 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.673064 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.673083 4913 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c044c3c-7de5-45ec-b450-cbfaf0ca415f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.741034 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fe4614c3-9118-41ab-be00-667f0bbca6bb","Type":"ContainerStarted","Data":"f2b640b607529dc83a005e704188636115d71e4313f4099aa98e8bb544bd04e4"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.742838 4913 generic.go:334] "Generic (PLEG): container finished" podID="82163264-a7e8-4183-b162-8ddabbce7f39" containerID="ba6de2ffb7c203f7458dff486130e95a87d07f1222801f2e843cb513a1548243" exitCode=0 Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.742957 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bf4s9" event={"ID":"82163264-a7e8-4183-b162-8ddabbce7f39","Type":"ContainerDied","Data":"ba6de2ffb7c203f7458dff486130e95a87d07f1222801f2e843cb513a1548243"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.743067 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bf4s9" event={"ID":"82163264-a7e8-4183-b162-8ddabbce7f39","Type":"ContainerStarted","Data":"386cacb7893914b1dc198252c3dce66412f70efcabf7182c6f41a6131f7130c2"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.745105 4913 generic.go:334] "Generic (PLEG): container finished" podID="3fa06e3e-e04d-481f-87e0-a55d168994f7" containerID="ff54338cd5d7745e0ab1b98057764d1aec54032102b8ec54ceb2241b8c876f42" exitCode=0 Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.745160 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-g8j5s" event={"ID":"3fa06e3e-e04d-481f-87e0-a55d168994f7","Type":"ContainerDied","Data":"ff54338cd5d7745e0ab1b98057764d1aec54032102b8ec54ceb2241b8c876f42"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.745182 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-g8j5s" event={"ID":"3fa06e3e-e04d-481f-87e0-a55d168994f7","Type":"ContainerStarted","Data":"49dab133699a55484476bd0375d6fac02a87614a397326fec1069401b3a7c2f1"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.748416 4913 generic.go:334] "Generic (PLEG): container finished" podID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerID="eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23" exitCode=137 Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.748529 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b6c9764d-c6wjv" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.748619 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6c9764d-c6wjv" event={"ID":"3c044c3c-7de5-45ec-b450-cbfaf0ca415f","Type":"ContainerDied","Data":"eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.748728 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6c9764d-c6wjv" event={"ID":"3c044c3c-7de5-45ec-b450-cbfaf0ca415f","Type":"ContainerDied","Data":"612a9c8c65730a9caf3e4e4c5859019c1a76881474e0a7a2f1b470e089b60c5b"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.748827 4913 scope.go:117] "RemoveContainer" containerID="59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.751117 4913 generic.go:334] "Generic (PLEG): container finished" podID="4983dc8d-2950-45be-9bd3-33f5e24d52ef" containerID="0bfeade2f85027bd1c26bb56a1d886b3f1f9639a4cead38dcefcbd63b3cfd1b8" exitCode=0 Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.751182 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkhhg" event={"ID":"4983dc8d-2950-45be-9bd3-33f5e24d52ef","Type":"ContainerDied","Data":"0bfeade2f85027bd1c26bb56a1d886b3f1f9639a4cead38dcefcbd63b3cfd1b8"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.751241 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkhhg" event={"ID":"4983dc8d-2950-45be-9bd3-33f5e24d52ef","Type":"ContainerStarted","Data":"53adec0d08cc75b0a93432ae8cb829b9b927338e3379aac83e7dd8a89e2e2f0b"} Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.751317 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x9gmn" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.751335 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qzddd" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.761700 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.597009349 podStartE2EDuration="11.761684201s" podCreationTimestamp="2025-10-01 12:54:52 +0000 UTC" firstStartedPulling="2025-10-01 12:54:53.370682476 +0000 UTC m=+1025.274158054" lastFinishedPulling="2025-10-01 12:55:02.535357338 +0000 UTC m=+1034.438832906" observedRunningTime="2025-10-01 12:55:03.759607235 +0000 UTC m=+1035.663082833" watchObservedRunningTime="2025-10-01 12:55:03.761684201 +0000 UTC m=+1035.665159779" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.874605 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b6c9764d-c6wjv"] Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.879978 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b6c9764d-c6wjv"] Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.938051 4913 scope.go:117] "RemoveContainer" containerID="eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.964804 4913 scope.go:117] "RemoveContainer" containerID="59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37" Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.965321 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37\": container with ID starting with 59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37 not found: ID does not exist" containerID="59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.965392 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37"} err="failed to get container status \"59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37\": rpc error: code = NotFound desc = could not find container \"59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37\": container with ID starting with 59ec9df9fd1874483d5b297dfb0b0925ec8363c07b37d09ae1782751e5d9fc37 not found: ID does not exist" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.965444 4913 scope.go:117] "RemoveContainer" containerID="eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23" Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.965725 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23\": container with ID starting with eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23 not found: ID does not exist" containerID="eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23" Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.965781 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23"} err="failed to get container status \"eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23\": rpc error: code = NotFound desc = could not find container \"eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23\": container with ID starting with 
eb46af49a210dc34c36f5f2a0ca5ab57aaf2ea1027d59e67c8f38d9dc1d63a23 not found: ID does not exist"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992222 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-dd4859979-qdfbk"]
Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.992688 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992711 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon"
Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.992721 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" containerName="cinder-db-sync"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992729 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" containerName="cinder-db-sync"
Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.992747 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" containerName="barbican-db-sync"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992755 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" containerName="barbican-db-sync"
Oct 01 12:55:03 crc kubenswrapper[4913]: E1001 12:55:03.992770 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon-log"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992777 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon-log"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992972 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" containerName="cinder-db-sync"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.992996 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" containerName="barbican-db-sync"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.993022 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon-log"
Oct 01 12:55:03 crc kubenswrapper[4913]: I1001 12:55:03.993039 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" containerName="horizon"
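The scope.go RemoveContainer calls a few records above draw NotFound replies from the runtime for IDs that were already purged, and the kubelet logs them without treating the cleanup as failed; the cpu_manager and memory_manager RemoveStaleState lines just before this point apply the same hygiene to per-container resource-manager state. A toy sketch of that idempotent-cleanup pattern (hypothetical helper, not the kubelet's implementation; container IDs shortened from the log):

    // Sketch: deleting something that is already gone counts as success.
    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("not found") // stand-in for a CRI NotFound status

    // removeContainer is a hypothetical helper mimicking the runtime call.
    func removeContainer(id string, runtime map[string]bool) error {
        if !runtime[id] {
            return fmt.Errorf("could not find container %q: %w", id, errNotFound)
        }
        delete(runtime, id)
        return nil
    }

    func main() {
        runtime := map[string]bool{"eb46af49": true}
        for _, id := range []string{"59ec9df9", "eb46af49", "eb46af49"} {
            err := removeContainer(id, runtime)
            switch {
            case err == nil:
                fmt.Println("RemoveContainer succeeded:", id)
            case errors.Is(err, errNotFound):
                // Mirrors the log: the NotFound is reported but swallowed,
                // so repeated cleanup passes stay safe.
                fmt.Println("already gone, ignoring:", err)
            default:
                fmt.Println("cleanup failed:", err)
            }
        }
    }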
Need to start a new one" pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.005924 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.006171 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sl6r4" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.006414 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.018857 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-dd4859979-qdfbk"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.071606 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.072947 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.075939 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.076189 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.076340 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kgnr4" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.076404 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.087987 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-584b46787d-q4kkd"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.089432 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.098623 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.120756 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-584b46787d-q4kkd"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.130067 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.141371 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dbf568d79-xqzxz"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.142822 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.157378 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dbf568d79-xqzxz"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183140 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee5a8249-034c-4e5a-b562-f98954678ca0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183209 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-scripts\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183252 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7906ec7-7151-42d3-a66f-8f269a3bf03f-logs\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183290 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg7sv\" (UniqueName: \"kubernetes.io/projected/ee5a8249-034c-4e5a-b562-f98954678ca0-kube-api-access-mg7sv\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183412 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183449 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183497 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-config-data\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183594 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183706 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjw5r\" 
(UniqueName: \"kubernetes.io/projected/d7906ec7-7151-42d3-a66f-8f269a3bf03f-kube-api-access-jjw5r\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183781 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-config-data-custom\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.183840 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-combined-ca-bundle\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.240318 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-778944cb96-tm27q"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.242068 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.255671 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.256364 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbf568d79-xqzxz"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.269726 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cf4dd4865-c4gnr"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.271197 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: E1001 12:55:04.276737 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-4xb2v ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" podUID="24d93835-d105-428c-8aa1-9f07fea8d97d" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.281629 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-778944cb96-tm27q"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.285902 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfdhw\" (UniqueName: \"kubernetes.io/projected/06b8d5ee-e00b-4c23-8fbc-c817160bac72-kube-api-access-jfdhw\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.285972 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee5a8249-034c-4e5a-b562-f98954678ca0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286036 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286059 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-scripts\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286100 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7906ec7-7151-42d3-a66f-8f269a3bf03f-logs\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286123 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg7sv\" (UniqueName: \"kubernetes.io/projected/ee5a8249-034c-4e5a-b562-f98954678ca0-kube-api-access-mg7sv\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286145 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286179 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-combined-ca-bundle\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286203 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286226 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-config-data-custom\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286252 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-config-data\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286302 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286347 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjw5r\" (UniqueName: \"kubernetes.io/projected/d7906ec7-7151-42d3-a66f-8f269a3bf03f-kube-api-access-jjw5r\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-config\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286395 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286429 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-config-data\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286451 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-config-data-custom\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286478 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xb2v\" (UniqueName: \"kubernetes.io/projected/24d93835-d105-428c-8aa1-9f07fea8d97d-kube-api-access-4xb2v\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286502 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-dns-svc\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286526 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-combined-ca-bundle\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286568 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b8d5ee-e00b-4c23-8fbc-c817160bac72-logs\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.286677 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee5a8249-034c-4e5a-b562-f98954678ca0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.290667 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7906ec7-7151-42d3-a66f-8f269a3bf03f-logs\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.306919 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.307506 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.308959 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-config-data\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.309512 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.310172 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-combined-ca-bundle\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.310381 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-scripts\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.313763 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7906ec7-7151-42d3-a66f-8f269a3bf03f-config-data-custom\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.326219 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg7sv\" (UniqueName: \"kubernetes.io/projected/ee5a8249-034c-4e5a-b562-f98954678ca0-kube-api-access-mg7sv\") pod \"cinder-scheduler-0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.332375 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf4dd4865-c4gnr"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.335906 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjw5r\" (UniqueName: \"kubernetes.io/projected/d7906ec7-7151-42d3-a66f-8f269a3bf03f-kube-api-access-jjw5r\") pod \"barbican-worker-dd4859979-qdfbk\" (UID: \"d7906ec7-7151-42d3-a66f-8f269a3bf03f\") " pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388305 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe92d074-cafa-4a38-a8a3-f49068a3366d-logs\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-combined-ca-bundle\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388400 4913 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data-custom\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388438 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdbw\" (UniqueName: \"kubernetes.io/projected/fe92d074-cafa-4a38-a8a3-f49068a3366d-kube-api-access-6qdbw\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388471 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388503 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dj27\" (UniqueName: \"kubernetes.io/projected/7abd6773-6041-4cb9-be41-a6438149974e-kube-api-access-8dj27\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388524 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388564 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-combined-ca-bundle\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388590 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-config-data-custom\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388623 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388673 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: 
\"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388699 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-config\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388723 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388757 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-config-data\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388782 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-dns-svc\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388809 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xb2v\" (UniqueName: \"kubernetes.io/projected/24d93835-d105-428c-8aa1-9f07fea8d97d-kube-api-access-4xb2v\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388830 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-dns-svc\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388873 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b8d5ee-e00b-4c23-8fbc-c817160bac72-logs\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388904 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-config\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.388931 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfdhw\" (UniqueName: \"kubernetes.io/projected/06b8d5ee-e00b-4c23-8fbc-c817160bac72-kube-api-access-jfdhw\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: 
\"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.390153 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.395648 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-dns-svc\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.396617 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.397817 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06b8d5ee-e00b-4c23-8fbc-c817160bac72-logs\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.402566 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-combined-ca-bundle\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.404961 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-config\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.405891 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-config-data\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.413085 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06b8d5ee-e00b-4c23-8fbc-c817160bac72-config-data-custom\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.421825 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.423532 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.424725 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xb2v\" (UniqueName: \"kubernetes.io/projected/24d93835-d105-428c-8aa1-9f07fea8d97d-kube-api-access-4xb2v\") pod \"dnsmasq-dns-7dbf568d79-xqzxz\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.424732 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfdhw\" (UniqueName: \"kubernetes.io/projected/06b8d5ee-e00b-4c23-8fbc-c817160bac72-kube-api-access-jfdhw\") pod \"barbican-keystone-listener-584b46787d-q4kkd\" (UID: \"06b8d5ee-e00b-4c23-8fbc-c817160bac72\") " pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.431634 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.432150 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.435133 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.439070 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.491724 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dj27\" (UniqueName: \"kubernetes.io/projected/7abd6773-6041-4cb9-be41-a6438149974e-kube-api-access-8dj27\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.491781 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.491837 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.491921 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.491953 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-dns-svc\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 
12:55:04.491996 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-config\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.492016 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe92d074-cafa-4a38-a8a3-f49068a3366d-logs\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.492394 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-combined-ca-bundle\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.492420 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data-custom\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.492447 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdbw\" (UniqueName: \"kubernetes.io/projected/fe92d074-cafa-4a38-a8a3-f49068a3366d-kube-api-access-6qdbw\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.500596 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.501900 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-config\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.502784 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe92d074-cafa-4a38-a8a3-f49068a3366d-logs\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.507105 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-combined-ca-bundle\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.509523 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.510540 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-dns-svc\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.511549 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.515023 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data-custom\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.517507 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dj27\" (UniqueName: \"kubernetes.io/projected/7abd6773-6041-4cb9-be41-a6438149974e-kube-api-access-8dj27\") pod \"dnsmasq-dns-6cf4dd4865-c4gnr\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") " pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.524889 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdbw\" (UniqueName: \"kubernetes.io/projected/fe92d074-cafa-4a38-a8a3-f49068a3366d-kube-api-access-6qdbw\") pod \"barbican-api-778944cb96-tm27q\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.562721 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.593574 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596196 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-scripts\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596255 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596417 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596435 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596462 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596494 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-logs\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.596549 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hnq\" (UniqueName: \"kubernetes.io/projected/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-kube-api-access-28hnq\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.636005 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-dd4859979-qdfbk" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.708422 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28hnq\" (UniqueName: \"kubernetes.io/projected/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-kube-api-access-28hnq\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.708863 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-scripts\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.708936 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.709013 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.709050 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.709114 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.709170 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-logs\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.710611 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-logs\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.710658 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.715879 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " 
pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.715937 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-scripts\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.715963 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.723456 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.728576 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hnq\" (UniqueName: \"kubernetes.io/projected/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-kube-api-access-28hnq\") pod \"cinder-api-0\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " pod="openstack/cinder-api-0" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.772921 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.789430 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.838788 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c044c3c-7de5-45ec-b450-cbfaf0ca415f" path="/var/lib/kubelet/pods/3c044c3c-7de5-45ec-b450-cbfaf0ca415f/volumes" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.911385 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-config\") pod \"24d93835-d105-428c-8aa1-9f07fea8d97d\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.911454 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-nb\") pod \"24d93835-d105-428c-8aa1-9f07fea8d97d\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.911497 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xb2v\" (UniqueName: \"kubernetes.io/projected/24d93835-d105-428c-8aa1-9f07fea8d97d-kube-api-access-4xb2v\") pod \"24d93835-d105-428c-8aa1-9f07fea8d97d\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.911632 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-sb\") pod \"24d93835-d105-428c-8aa1-9f07fea8d97d\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.911675 4913 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-dns-svc\") pod \"24d93835-d105-428c-8aa1-9f07fea8d97d\" (UID: \"24d93835-d105-428c-8aa1-9f07fea8d97d\") " Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.912974 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24d93835-d105-428c-8aa1-9f07fea8d97d" (UID: "24d93835-d105-428c-8aa1-9f07fea8d97d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.913487 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24d93835-d105-428c-8aa1-9f07fea8d97d" (UID: "24d93835-d105-428c-8aa1-9f07fea8d97d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.913549 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24d93835-d105-428c-8aa1-9f07fea8d97d" (UID: "24d93835-d105-428c-8aa1-9f07fea8d97d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.913704 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-config" (OuterVolumeSpecName: "config") pod "24d93835-d105-428c-8aa1-9f07fea8d97d" (UID: "24d93835-d105-428c-8aa1-9f07fea8d97d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.917767 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d93835-d105-428c-8aa1-9f07fea8d97d-kube-api-access-4xb2v" (OuterVolumeSpecName: "kube-api-access-4xb2v") pod "24d93835-d105-428c-8aa1-9f07fea8d97d" (UID: "24d93835-d105-428c-8aa1-9f07fea8d97d"). InnerVolumeSpecName "kube-api-access-4xb2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:04 crc kubenswrapper[4913]: I1001 12:55:04.927525 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.005967 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:05 crc kubenswrapper[4913]: W1001 12:55:05.006886 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5a8249_034c_4e5a_b562_f98954678ca0.slice/crio-8908d22b986ef4aca28e97d222d3ce6ed36533f0fa93ebf240ff9714e9e9feb3 WatchSource:0}: Error finding container 8908d22b986ef4aca28e97d222d3ce6ed36533f0fa93ebf240ff9714e9e9feb3: Status 404 returned error can't find the container with id 8908d22b986ef4aca28e97d222d3ce6ed36533f0fa93ebf240ff9714e9e9feb3 Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.014473 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.014506 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.014515 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.014524 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d93835-d105-428c-8aa1-9f07fea8d97d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.014532 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xb2v\" (UniqueName: \"kubernetes.io/projected/24d93835-d105-428c-8aa1-9f07fea8d97d-kube-api-access-4xb2v\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.118168 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-584b46787d-q4kkd"] Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.207896 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.211970 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.236522 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k949\" (UniqueName: \"kubernetes.io/projected/3fa06e3e-e04d-481f-87e0-a55d168994f7-kube-api-access-2k949\") pod \"3fa06e3e-e04d-481f-87e0-a55d168994f7\" (UID: \"3fa06e3e-e04d-481f-87e0-a55d168994f7\") " Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.236665 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s25fb\" (UniqueName: \"kubernetes.io/projected/4983dc8d-2950-45be-9bd3-33f5e24d52ef-kube-api-access-s25fb\") pod \"4983dc8d-2950-45be-9bd3-33f5e24d52ef\" (UID: \"4983dc8d-2950-45be-9bd3-33f5e24d52ef\") " Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.243302 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa06e3e-e04d-481f-87e0-a55d168994f7-kube-api-access-2k949" (OuterVolumeSpecName: "kube-api-access-2k949") pod "3fa06e3e-e04d-481f-87e0-a55d168994f7" (UID: "3fa06e3e-e04d-481f-87e0-a55d168994f7"). InnerVolumeSpecName "kube-api-access-2k949". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.243421 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4983dc8d-2950-45be-9bd3-33f5e24d52ef-kube-api-access-s25fb" (OuterVolumeSpecName: "kube-api-access-s25fb") pod "4983dc8d-2950-45be-9bd3-33f5e24d52ef" (UID: "4983dc8d-2950-45be-9bd3-33f5e24d52ef"). InnerVolumeSpecName "kube-api-access-s25fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.340552 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s25fb\" (UniqueName: \"kubernetes.io/projected/4983dc8d-2950-45be-9bd3-33f5e24d52ef-kube-api-access-s25fb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.340600 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k949\" (UniqueName: \"kubernetes.io/projected/3fa06e3e-e04d-481f-87e0-a55d168994f7-kube-api-access-2k949\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.392332 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-778944cb96-tm27q"] Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.396525 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.442641 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42mrg\" (UniqueName: \"kubernetes.io/projected/82163264-a7e8-4183-b162-8ddabbce7f39-kube-api-access-42mrg\") pod \"82163264-a7e8-4183-b162-8ddabbce7f39\" (UID: \"82163264-a7e8-4183-b162-8ddabbce7f39\") " Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.456245 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82163264-a7e8-4183-b162-8ddabbce7f39-kube-api-access-42mrg" (OuterVolumeSpecName: "kube-api-access-42mrg") pod "82163264-a7e8-4183-b162-8ddabbce7f39" (UID: "82163264-a7e8-4183-b162-8ddabbce7f39"). InnerVolumeSpecName "kube-api-access-42mrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.543680 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42mrg\" (UniqueName: \"kubernetes.io/projected/82163264-a7e8-4183-b162-8ddabbce7f39-kube-api-access-42mrg\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.578135 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-dd4859979-qdfbk"] Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.598473 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf4dd4865-c4gnr"] Oct 01 12:55:05 crc kubenswrapper[4913]: W1001 12:55:05.614113 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7abd6773_6041_4cb9_be41_a6438149974e.slice/crio-2ab4b44eed9de4051ea315cbe6ce5b19bdce469d19d610c17476dc245753a74c WatchSource:0}: Error finding container 2ab4b44eed9de4051ea315cbe6ce5b19bdce469d19d610c17476dc245753a74c: Status 404 returned error can't find the container with id 2ab4b44eed9de4051ea315cbe6ce5b19bdce469d19d610c17476dc245753a74c Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.664984 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.784289 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" event={"ID":"06b8d5ee-e00b-4c23-8fbc-c817160bac72","Type":"ContainerStarted","Data":"6c341d5c4336384f0a45277c9b580f795f13f25dfeded7ec5e9e080fe03d0681"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.786045 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-g8j5s" event={"ID":"3fa06e3e-e04d-481f-87e0-a55d168994f7","Type":"ContainerDied","Data":"49dab133699a55484476bd0375d6fac02a87614a397326fec1069401b3a7c2f1"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.786065 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49dab133699a55484476bd0375d6fac02a87614a397326fec1069401b3a7c2f1" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.786113 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-g8j5s" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.804511 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd22053-f769-4cc8-94f9-6a0fbb0fde97","Type":"ContainerStarted","Data":"f8c07f3032486136c146dbffe4712310ceff1a995a87fab65c0c82a0cc61577d"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.812729 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-778944cb96-tm27q" event={"ID":"fe92d074-cafa-4a38-a8a3-f49068a3366d","Type":"ContainerStarted","Data":"9c0e67c0b49a361639069473d3506f3057f6038adb8989064f9c6907092d999d"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.812788 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-778944cb96-tm27q" event={"ID":"fe92d074-cafa-4a38-a8a3-f49068a3366d","Type":"ContainerStarted","Data":"a17f21edf05c5dde966137a9382c34d6ae3901237c4fca42bb0308ea2fddc496"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.818090 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bf4s9" event={"ID":"82163264-a7e8-4183-b162-8ddabbce7f39","Type":"ContainerDied","Data":"386cacb7893914b1dc198252c3dce66412f70efcabf7182c6f41a6131f7130c2"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.818150 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386cacb7893914b1dc198252c3dce66412f70efcabf7182c6f41a6131f7130c2" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.818108 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bf4s9" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.830190 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" event={"ID":"7abd6773-6041-4cb9-be41-a6438149974e","Type":"ContainerStarted","Data":"2ab4b44eed9de4051ea315cbe6ce5b19bdce469d19d610c17476dc245753a74c"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.841947 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkhhg" event={"ID":"4983dc8d-2950-45be-9bd3-33f5e24d52ef","Type":"ContainerDied","Data":"53adec0d08cc75b0a93432ae8cb829b9b927338e3379aac83e7dd8a89e2e2f0b"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.841980 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53adec0d08cc75b0a93432ae8cb829b9b927338e3379aac83e7dd8a89e2e2f0b" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.842028 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hkhhg" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.846779 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ee5a8249-034c-4e5a-b562-f98954678ca0","Type":"ContainerStarted","Data":"8908d22b986ef4aca28e97d222d3ce6ed36533f0fa93ebf240ff9714e9e9feb3"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.848669 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbf568d79-xqzxz" Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.851015 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-dd4859979-qdfbk" event={"ID":"d7906ec7-7151-42d3-a66f-8f269a3bf03f","Type":"ContainerStarted","Data":"d9469a40130884ed288a099490cdda53d74c146d11ca5a33491b2b4eaf6d5053"} Oct 01 12:55:05 crc kubenswrapper[4913]: I1001 12:55:05.999317 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbf568d79-xqzxz"] Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.038685 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dbf568d79-xqzxz"] Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.156083 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.826197 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d93835-d105-428c-8aa1-9f07fea8d97d" path="/var/lib/kubelet/pods/24d93835-d105-428c-8aa1-9f07fea8d97d/volumes" Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.867988 4913 generic.go:334] "Generic (PLEG): container finished" podID="57296118-560c-4764-b94a-472d8467f7c0" containerID="6e2d93f6b1e95834d602362606c94652fbad38050142bcb478336afa63092c4e" exitCode=0 Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.868030 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b6l67" event={"ID":"57296118-560c-4764-b94a-472d8467f7c0","Type":"ContainerDied","Data":"6e2d93f6b1e95834d602362606c94652fbad38050142bcb478336afa63092c4e"} Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.870795 4913 generic.go:334] "Generic (PLEG): container finished" podID="7abd6773-6041-4cb9-be41-a6438149974e" containerID="0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b" exitCode=0 Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.870858 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" event={"ID":"7abd6773-6041-4cb9-be41-a6438149974e","Type":"ContainerDied","Data":"0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b"} Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.872891 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd22053-f769-4cc8-94f9-6a0fbb0fde97","Type":"ContainerStarted","Data":"61d9313934bcd942f1bcfc575a0ba414020643feffc316ac190c2da91115d5f3"} Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.874752 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-778944cb96-tm27q" event={"ID":"fe92d074-cafa-4a38-a8a3-f49068a3366d","Type":"ContainerStarted","Data":"dad074d9b09f0dcf53d2ce6062cc6b470772e246bdb6bf5e4e2e77f88c5e4ff6"} Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.875059 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.875135 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:06 crc kubenswrapper[4913]: I1001 12:55:06.935603 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-778944cb96-tm27q" podStartSLOduration=2.935582279 podStartE2EDuration="2.935582279s" podCreationTimestamp="2025-10-01 12:55:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:06.930237692 +0000 UTC m=+1038.833713300" watchObservedRunningTime="2025-10-01 12:55:06.935582279 +0000 UTC m=+1038.839057857" Oct 01 12:55:07 crc kubenswrapper[4913]: I1001 12:55:07.884030 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ee5a8249-034c-4e5a-b562-f98954678ca0","Type":"ContainerStarted","Data":"0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132"} Oct 01 12:55:07 crc kubenswrapper[4913]: I1001 12:55:07.886443 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd22053-f769-4cc8-94f9-6a0fbb0fde97","Type":"ContainerStarted","Data":"de67833c00652503ac5ab566254eead3267f08c98e550a3df2f2d9bc88798a66"} Oct 01 12:55:07 crc kubenswrapper[4913]: I1001 12:55:07.886591 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api-log" containerID="cri-o://61d9313934bcd942f1bcfc575a0ba414020643feffc316ac190c2da91115d5f3" gracePeriod=30 Oct 01 12:55:07 crc kubenswrapper[4913]: I1001 12:55:07.886641 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api" containerID="cri-o://de67833c00652503ac5ab566254eead3267f08c98e550a3df2f2d9bc88798a66" gracePeriod=30 Oct 01 12:55:07 crc kubenswrapper[4913]: I1001 12:55:07.886725 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 12:55:07 crc kubenswrapper[4913]: I1001 12:55:07.912028 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.911983918 podStartE2EDuration="3.911983918s" podCreationTimestamp="2025-10-01 12:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:07.903017702 +0000 UTC m=+1039.806493320" watchObservedRunningTime="2025-10-01 12:55:07.911983918 +0000 UTC m=+1039.815459496" Oct 01 12:55:08 crc kubenswrapper[4913]: I1001 12:55:08.134900 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:55:08 crc kubenswrapper[4913]: I1001 12:55:08.187524 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66d867dfb6-r9zrq" Oct 01 12:55:08 crc kubenswrapper[4913]: I1001 12:55:08.912593 4913 generic.go:334] "Generic (PLEG): container finished" podID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerID="de67833c00652503ac5ab566254eead3267f08c98e550a3df2f2d9bc88798a66" exitCode=0 Oct 01 12:55:08 crc kubenswrapper[4913]: I1001 12:55:08.912985 4913 generic.go:334] "Generic (PLEG): container finished" podID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerID="61d9313934bcd942f1bcfc575a0ba414020643feffc316ac190c2da91115d5f3" exitCode=143 Oct 01 12:55:08 crc kubenswrapper[4913]: I1001 12:55:08.913957 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd22053-f769-4cc8-94f9-6a0fbb0fde97","Type":"ContainerDied","Data":"de67833c00652503ac5ab566254eead3267f08c98e550a3df2f2d9bc88798a66"} Oct 01 12:55:08 crc kubenswrapper[4913]: I1001 12:55:08.913987 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"bfd22053-f769-4cc8-94f9-6a0fbb0fde97","Type":"ContainerDied","Data":"61d9313934bcd942f1bcfc575a0ba414020643feffc316ac190c2da91115d5f3"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.310472 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b6l67" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.361007 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-config-data\") pod \"57296118-560c-4764-b94a-472d8467f7c0\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.361089 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-combined-ca-bundle\") pod \"57296118-560c-4764-b94a-472d8467f7c0\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.361139 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2qlb\" (UniqueName: \"kubernetes.io/projected/57296118-560c-4764-b94a-472d8467f7c0-kube-api-access-s2qlb\") pod \"57296118-560c-4764-b94a-472d8467f7c0\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.361169 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-db-sync-config-data\") pod \"57296118-560c-4764-b94a-472d8467f7c0\" (UID: \"57296118-560c-4764-b94a-472d8467f7c0\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.367679 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57296118-560c-4764-b94a-472d8467f7c0-kube-api-access-s2qlb" (OuterVolumeSpecName: "kube-api-access-s2qlb") pod "57296118-560c-4764-b94a-472d8467f7c0" (UID: "57296118-560c-4764-b94a-472d8467f7c0"). InnerVolumeSpecName "kube-api-access-s2qlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.375920 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "57296118-560c-4764-b94a-472d8467f7c0" (UID: "57296118-560c-4764-b94a-472d8467f7c0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.405895 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57296118-560c-4764-b94a-472d8467f7c0" (UID: "57296118-560c-4764-b94a-472d8467f7c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.447050 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-config-data" (OuterVolumeSpecName: "config-data") pod "57296118-560c-4764-b94a-472d8467f7c0" (UID: "57296118-560c-4764-b94a-472d8467f7c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.465409 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.465440 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.465451 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2qlb\" (UniqueName: \"kubernetes.io/projected/57296118-560c-4764-b94a-472d8467f7c0-kube-api-access-s2qlb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.465460 4913 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57296118-560c-4764-b94a-472d8467f7c0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.544866 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667216 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667379 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-combined-ca-bundle\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667478 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-etc-machine-id\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667570 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-scripts\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667609 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-logs\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667640 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28hnq\" (UniqueName: \"kubernetes.io/projected/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-kube-api-access-28hnq\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.667692 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data-custom\") pod \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\" (UID: \"bfd22053-f769-4cc8-94f9-6a0fbb0fde97\") " Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.668471 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-logs" (OuterVolumeSpecName: "logs") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.668533 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.676436 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.676539 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-scripts" (OuterVolumeSpecName: "scripts") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.676643 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-kube-api-access-28hnq" (OuterVolumeSpecName: "kube-api-access-28hnq") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "kube-api-access-28hnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.724485 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.756687 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data" (OuterVolumeSpecName: "config-data") pod "bfd22053-f769-4cc8-94f9-6a0fbb0fde97" (UID: "bfd22053-f769-4cc8-94f9-6a0fbb0fde97"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769211 4913 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769249 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769278 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769290 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28hnq\" (UniqueName: \"kubernetes.io/projected/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-kube-api-access-28hnq\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769304 4913 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769314 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.769326 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd22053-f769-4cc8-94f9-6a0fbb0fde97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.925261 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" event={"ID":"7abd6773-6041-4cb9-be41-a6438149974e","Type":"ContainerStarted","Data":"2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.925567 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.928064 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b6l67" event={"ID":"57296118-560c-4764-b94a-472d8467f7c0","Type":"ContainerDied","Data":"4635b0c2957a72496e904a97f09d3fdb273ebf452f6c4a8f75efac241057deea"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.928091 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4635b0c2957a72496e904a97f09d3fdb273ebf452f6c4a8f75efac241057deea" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.928118 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b6l67" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.929432 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-dd4859979-qdfbk" event={"ID":"d7906ec7-7151-42d3-a66f-8f269a3bf03f","Type":"ContainerStarted","Data":"e055facea4cc175081d69a212ca5ce3f520ba78eeacc4fa469ec28ef683307b6"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.929454 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-dd4859979-qdfbk" event={"ID":"d7906ec7-7151-42d3-a66f-8f269a3bf03f","Type":"ContainerStarted","Data":"c573e7c71f858e09dc7f306f2f138c582b879a005c5ac3d639eb549382bf1483"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.931683 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" event={"ID":"06b8d5ee-e00b-4c23-8fbc-c817160bac72","Type":"ContainerStarted","Data":"0ad03fa1a7332557a9ebf6e07cccfc7d78b59f87a28f43af5a5eea5589cea028"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.931708 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" event={"ID":"06b8d5ee-e00b-4c23-8fbc-c817160bac72","Type":"ContainerStarted","Data":"600f108bfd7c4e47eb8e6445d3084b7ca7698a7e8d324c159b71281a00ba1862"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.933557 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd22053-f769-4cc8-94f9-6a0fbb0fde97","Type":"ContainerDied","Data":"f8c07f3032486136c146dbffe4712310ceff1a995a87fab65c0c82a0cc61577d"} Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.933582 4913 scope.go:117] "RemoveContainer" containerID="de67833c00652503ac5ab566254eead3267f08c98e550a3df2f2d9bc88798a66" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.933679 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.961948 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" podStartSLOduration=5.961931622 podStartE2EDuration="5.961931622s" podCreationTimestamp="2025-10-01 12:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:09.954519778 +0000 UTC m=+1041.857995376" watchObservedRunningTime="2025-10-01 12:55:09.961931622 +0000 UTC m=+1041.865407200" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.990223 4913 scope.go:117] "RemoveContainer" containerID="61d9313934bcd942f1bcfc575a0ba414020643feffc316ac190c2da91115d5f3" Oct 01 12:55:09 crc kubenswrapper[4913]: I1001 12:55:09.991078 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-dd4859979-qdfbk" podStartSLOduration=3.353746767 podStartE2EDuration="6.991056131s" podCreationTimestamp="2025-10-01 12:55:03 +0000 UTC" firstStartedPulling="2025-10-01 12:55:05.594650353 +0000 UTC m=+1037.498125931" lastFinishedPulling="2025-10-01 12:55:09.231959717 +0000 UTC m=+1041.135435295" observedRunningTime="2025-10-01 12:55:09.97753905 +0000 UTC m=+1041.881014648" watchObservedRunningTime="2025-10-01 12:55:09.991056131 +0000 UTC m=+1041.894531709" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.012178 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-584b46787d-q4kkd" podStartSLOduration=1.899361952 podStartE2EDuration="6.012156921s" podCreationTimestamp="2025-10-01 12:55:04 +0000 UTC" firstStartedPulling="2025-10-01 12:55:05.118628943 +0000 UTC m=+1037.022104521" lastFinishedPulling="2025-10-01 12:55:09.231423912 +0000 UTC m=+1041.134899490" observedRunningTime="2025-10-01 12:55:10.004812549 +0000 UTC m=+1041.908288147" watchObservedRunningTime="2025-10-01 12:55:10.012156921 +0000 UTC m=+1041.915632499" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.065751 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.077917 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.084863 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:10 crc kubenswrapper[4913]: E1001 12:55:10.085239 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085256 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api" Oct 01 12:55:10 crc kubenswrapper[4913]: E1001 12:55:10.085278 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa06e3e-e04d-481f-87e0-a55d168994f7" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085285 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa06e3e-e04d-481f-87e0-a55d168994f7" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: E1001 12:55:10.085297 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api-log" Oct 01 12:55:10 crc 
kubenswrapper[4913]: I1001 12:55:10.085303 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api-log" Oct 01 12:55:10 crc kubenswrapper[4913]: E1001 12:55:10.085315 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4983dc8d-2950-45be-9bd3-33f5e24d52ef" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085321 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4983dc8d-2950-45be-9bd3-33f5e24d52ef" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: E1001 12:55:10.085340 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82163264-a7e8-4183-b162-8ddabbce7f39" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085345 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="82163264-a7e8-4183-b162-8ddabbce7f39" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: E1001 12:55:10.085355 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57296118-560c-4764-b94a-472d8467f7c0" containerName="glance-db-sync" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085361 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="57296118-560c-4764-b94a-472d8467f7c0" containerName="glance-db-sync" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085511 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="57296118-560c-4764-b94a-472d8467f7c0" containerName="glance-db-sync" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085524 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa06e3e-e04d-481f-87e0-a55d168994f7" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085538 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api-log" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085547 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" containerName="cinder-api" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085555 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="82163264-a7e8-4183-b162-8ddabbce7f39" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.085565 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4983dc8d-2950-45be-9bd3-33f5e24d52ef" containerName="mariadb-database-create" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.086438 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.088821 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.089030 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.090120 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.093100 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182165 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182213 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182257 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-scripts\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182335 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-config-data-custom\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182380 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58551fb-1f57-46d7-9209-dc92b0ebb305-logs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182514 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-config-data\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182614 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c58551fb-1f57-46d7-9209-dc92b0ebb305-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182652 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wl7g\" (UniqueName: 
\"kubernetes.io/projected/c58551fb-1f57-46d7-9209-dc92b0ebb305-kube-api-access-7wl7g\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.182758 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284069 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58551fb-1f57-46d7-9209-dc92b0ebb305-logs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284109 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-config-data\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284143 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c58551fb-1f57-46d7-9209-dc92b0ebb305-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284163 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wl7g\" (UniqueName: \"kubernetes.io/projected/c58551fb-1f57-46d7-9209-dc92b0ebb305-kube-api-access-7wl7g\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284207 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284231 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284255 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284310 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-scripts\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284339 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-config-data-custom\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284467 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c58551fb-1f57-46d7-9209-dc92b0ebb305-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.284853 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58551fb-1f57-46d7-9209-dc92b0ebb305-logs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.288487 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.289672 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-scripts\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.292891 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.293074 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-config-data\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.306523 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wl7g\" (UniqueName: \"kubernetes.io/projected/c58551fb-1f57-46d7-9209-dc92b0ebb305-kube-api-access-7wl7g\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.307388 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.312811 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c58551fb-1f57-46d7-9209-dc92b0ebb305-config-data-custom\") pod \"cinder-api-0\" (UID: \"c58551fb-1f57-46d7-9209-dc92b0ebb305\") " pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.412528 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.720226 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf4dd4865-c4gnr"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.739184 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94448955c-ljhxn"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.745480 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.751410 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94448955c-ljhxn"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.793716 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-config\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.793899 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-sb\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.793938 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-nb\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.793979 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tw6d\" (UniqueName: \"kubernetes.io/projected/6d0b7295-6064-4486-a41f-ce0a9332ac61-kube-api-access-5tw6d\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.794067 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-dns-svc\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.846913 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd22053-f769-4cc8-94f9-6a0fbb0fde97" path="/var/lib/kubelet/pods/bfd22053-f769-4cc8-94f9-6a0fbb0fde97/volumes" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.899541 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-sb\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.899604 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-nb\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.899814 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tw6d\" (UniqueName: \"kubernetes.io/projected/6d0b7295-6064-4486-a41f-ce0a9332ac61-kube-api-access-5tw6d\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.899917 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-dns-svc\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.900233 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-config\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.901338 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-sb\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.901892 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-nb\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.902442 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-config\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.903457 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-dns-svc\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.924715 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tw6d\" (UniqueName: \"kubernetes.io/projected/6d0b7295-6064-4486-a41f-ce0a9332ac61-kube-api-access-5tw6d\") pod \"dnsmasq-dns-94448955c-ljhxn\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") " pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.956845 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ee5a8249-034c-4e5a-b562-f98954678ca0","Type":"ContainerStarted","Data":"2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46"} Oct 01 
12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.961202 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:55:10 crc kubenswrapper[4913]: I1001 12:55:10.990227 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.929942871 podStartE2EDuration="6.990201404s" podCreationTimestamp="2025-10-01 12:55:04 +0000 UTC" firstStartedPulling="2025-10-01 12:55:05.009222616 +0000 UTC m=+1036.912698194" lastFinishedPulling="2025-10-01 12:55:06.069481149 +0000 UTC m=+1037.972956727" observedRunningTime="2025-10-01 12:55:10.980564569 +0000 UTC m=+1042.884040177" watchObservedRunningTime="2025-10-01 12:55:10.990201404 +0000 UTC m=+1042.893676982" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.082731 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94448955c-ljhxn" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.166022 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55679b7754-wczpf"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.167310 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.179778 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.180035 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.181236 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55679b7754-wczpf"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213515 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27ddr\" (UniqueName: \"kubernetes.io/projected/42099d8f-bc53-4134-9351-cbbc06da162e-kube-api-access-27ddr\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213574 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-internal-tls-certs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213606 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42099d8f-bc53-4134-9351-cbbc06da162e-logs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213628 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-combined-ca-bundle\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213663 4913 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-config-data-custom\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213693 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-config-data\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.213751 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-public-tls-certs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.218633 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-70c3-account-create-9wk5c"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.219744 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-70c3-account-create-9wk5c" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.222926 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.231870 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-70c3-account-create-9wk5c"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318333 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-public-tls-certs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318381 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27ddr\" (UniqueName: \"kubernetes.io/projected/42099d8f-bc53-4134-9351-cbbc06da162e-kube-api-access-27ddr\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318421 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-internal-tls-certs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318453 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42099d8f-bc53-4134-9351-cbbc06da162e-logs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318474 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-combined-ca-bundle\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318509 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-config-data-custom\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318540 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-config-data\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.318569 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp4h6\" (UniqueName: \"kubernetes.io/projected/c2e6e788-d705-40e2-b19e-23d915ccc7cd-kube-api-access-jp4h6\") pod \"nova-cell0-70c3-account-create-9wk5c\" (UID: \"c2e6e788-d705-40e2-b19e-23d915ccc7cd\") " pod="openstack/nova-cell0-70c3-account-create-9wk5c" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.319719 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42099d8f-bc53-4134-9351-cbbc06da162e-logs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.342744 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-config-data-custom\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.345132 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-public-tls-certs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.346100 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-config-data\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.348778 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27ddr\" (UniqueName: \"kubernetes.io/projected/42099d8f-bc53-4134-9351-cbbc06da162e-kube-api-access-27ddr\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.359324 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-combined-ca-bundle\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.372931 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42099d8f-bc53-4134-9351-cbbc06da162e-internal-tls-certs\") pod \"barbican-api-55679b7754-wczpf\" (UID: \"42099d8f-bc53-4134-9351-cbbc06da162e\") " pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.427870 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-55b9-account-create-v4tbp"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.428296 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp4h6\" (UniqueName: \"kubernetes.io/projected/c2e6e788-d705-40e2-b19e-23d915ccc7cd-kube-api-access-jp4h6\") pod \"nova-cell0-70c3-account-create-9wk5c\" (UID: \"c2e6e788-d705-40e2-b19e-23d915ccc7cd\") " pod="openstack/nova-cell0-70c3-account-create-9wk5c" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.429113 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-55b9-account-create-v4tbp" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.433188 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.471346 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-55b9-account-create-v4tbp"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.489089 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp4h6\" (UniqueName: \"kubernetes.io/projected/c2e6e788-d705-40e2-b19e-23d915ccc7cd-kube-api-access-jp4h6\") pod \"nova-cell0-70c3-account-create-9wk5c\" (UID: \"c2e6e788-d705-40e2-b19e-23d915ccc7cd\") " pod="openstack/nova-cell0-70c3-account-create-9wk5c" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.530042 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtp6k\" (UniqueName: \"kubernetes.io/projected/fd92064d-f7ca-4b7c-8596-ccc759c048ad-kube-api-access-qtp6k\") pod \"nova-cell1-55b9-account-create-v4tbp\" (UID: \"fd92064d-f7ca-4b7c-8596-ccc759c048ad\") " pod="openstack/nova-cell1-55b9-account-create-v4tbp" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.542542 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.579930 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-70c3-account-create-9wk5c" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.633094 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtp6k\" (UniqueName: \"kubernetes.io/projected/fd92064d-f7ca-4b7c-8596-ccc759c048ad-kube-api-access-qtp6k\") pod \"nova-cell1-55b9-account-create-v4tbp\" (UID: \"fd92064d-f7ca-4b7c-8596-ccc759c048ad\") " pod="openstack/nova-cell1-55b9-account-create-v4tbp" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.651580 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtp6k\" (UniqueName: \"kubernetes.io/projected/fd92064d-f7ca-4b7c-8596-ccc759c048ad-kube-api-access-qtp6k\") pod \"nova-cell1-55b9-account-create-v4tbp\" (UID: \"fd92064d-f7ca-4b7c-8596-ccc759c048ad\") " pod="openstack/nova-cell1-55b9-account-create-v4tbp" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.658011 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94448955c-ljhxn"] Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.843488 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-55b9-account-create-v4tbp" Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.991355 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c58551fb-1f57-46d7-9209-dc92b0ebb305","Type":"ContainerStarted","Data":"3cd00a7761e5b1e834e18ebda67cbbcc28a521146a9735728ea9c615bc68ad8b"} Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.991649 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c58551fb-1f57-46d7-9209-dc92b0ebb305","Type":"ContainerStarted","Data":"47689489f16c4f7f204379e7b3a7d5a7094ef9c13e3e794586f968e819c887cc"} Oct 01 12:55:11 crc kubenswrapper[4913]: I1001 12:55:11.998289 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94448955c-ljhxn" event={"ID":"6d0b7295-6064-4486-a41f-ce0a9332ac61","Type":"ContainerStarted","Data":"d246ecb4a6011f640fa33cd84bad38929461117825b5f9e935565b144fd7d596"} Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:11.998598 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" podUID="7abd6773-6041-4cb9-be41-a6438149974e" containerName="dnsmasq-dns" containerID="cri-o://2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1" gracePeriod=10 Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.138473 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55679b7754-wczpf"] Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.147462 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-70c3-account-create-9wk5c"] Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.432765 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-55b9-account-create-v4tbp"] Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.795896 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr"
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.876240 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dj27\" (UniqueName: \"kubernetes.io/projected/7abd6773-6041-4cb9-be41-a6438149974e-kube-api-access-8dj27\") pod \"7abd6773-6041-4cb9-be41-a6438149974e\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") "
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.876326 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-nb\") pod \"7abd6773-6041-4cb9-be41-a6438149974e\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") "
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.876385 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-dns-svc\") pod \"7abd6773-6041-4cb9-be41-a6438149974e\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") "
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.876450 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-config\") pod \"7abd6773-6041-4cb9-be41-a6438149974e\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") "
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.876467 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-sb\") pod \"7abd6773-6041-4cb9-be41-a6438149974e\" (UID: \"7abd6773-6041-4cb9-be41-a6438149974e\") "
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.911124 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abd6773-6041-4cb9-be41-a6438149974e-kube-api-access-8dj27" (OuterVolumeSpecName: "kube-api-access-8dj27") pod "7abd6773-6041-4cb9-be41-a6438149974e" (UID: "7abd6773-6041-4cb9-be41-a6438149974e"). InnerVolumeSpecName "kube-api-access-8dj27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:12 crc kubenswrapper[4913]: I1001 12:55:12.981429 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dj27\" (UniqueName: \"kubernetes.io/projected/7abd6773-6041-4cb9-be41-a6438149974e-kube-api-access-8dj27\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.009811 4913 generic.go:334] "Generic (PLEG): container finished" podID="7abd6773-6041-4cb9-be41-a6438149974e" containerID="2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1" exitCode=0
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.009883 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" event={"ID":"7abd6773-6041-4cb9-be41-a6438149974e","Type":"ContainerDied","Data":"2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.009924 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.009968 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf4dd4865-c4gnr" event={"ID":"7abd6773-6041-4cb9-be41-a6438149974e","Type":"ContainerDied","Data":"2ab4b44eed9de4051ea315cbe6ce5b19bdce469d19d610c17476dc245753a74c"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.009992 4913 scope.go:117] "RemoveContainer" containerID="2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.011668 4913 generic.go:334] "Generic (PLEG): container finished" podID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerID="f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a" exitCode=0
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.011726 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94448955c-ljhxn" event={"ID":"6d0b7295-6064-4486-a41f-ce0a9332ac61","Type":"ContainerDied","Data":"f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.016283 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-70c3-account-create-9wk5c" event={"ID":"c2e6e788-d705-40e2-b19e-23d915ccc7cd","Type":"ContainerStarted","Data":"35811603ce86b87b16e3fd29bfca2c735f17a20dbdeab6d793181e40bcb4ef63"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.016327 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-70c3-account-create-9wk5c" event={"ID":"c2e6e788-d705-40e2-b19e-23d915ccc7cd","Type":"ContainerStarted","Data":"de615a74e5c856e9fda39257f167b36bd8a8ca4b018526849e61ad09a6aa29e4"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.022523 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55b9-account-create-v4tbp" event={"ID":"fd92064d-f7ca-4b7c-8596-ccc759c048ad","Type":"ContainerStarted","Data":"327d699d3496bbcf6a5582b1e50a0ced444acf8fbf87978a9f96445422ea93b3"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.022568 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55b9-account-create-v4tbp" event={"ID":"fd92064d-f7ca-4b7c-8596-ccc759c048ad","Type":"ContainerStarted","Data":"832cbcf618ba87b4d3d1a9b203f3ef96e6b9d99ee1a36523e28ad73367775007"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.024125 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55679b7754-wczpf" event={"ID":"42099d8f-bc53-4134-9351-cbbc06da162e","Type":"ContainerStarted","Data":"e8aa4d879c1b9facd4633243e99a3286c9623dd78f8dd8232c0011097a805aeb"}
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.030587 4913 scope.go:117] "RemoveContainer" containerID="0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.036121 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-70c3-account-create-9wk5c" podStartSLOduration=2.036105457 podStartE2EDuration="2.036105457s" podCreationTimestamp="2025-10-01 12:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:13.034672808 +0000 UTC m=+1044.938148396" watchObservedRunningTime="2025-10-01 12:55:13.036105457 +0000 UTC m=+1044.939581035"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.088065 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-55b9-account-create-v4tbp" podStartSLOduration=2.088045453 podStartE2EDuration="2.088045453s" podCreationTimestamp="2025-10-01 12:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:13.080593038 +0000 UTC m=+1044.984068636" watchObservedRunningTime="2025-10-01 12:55:13.088045453 +0000 UTC m=+1044.991521031"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.118821 4913 scope.go:117] "RemoveContainer" containerID="2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1"
Oct 01 12:55:13 crc kubenswrapper[4913]: E1001 12:55:13.120736 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1\": container with ID starting with 2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1 not found: ID does not exist" containerID="2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.120779 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1"} err="failed to get container status \"2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1\": rpc error: code = NotFound desc = could not find container \"2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1\": container with ID starting with 2c0a0570433d20e62be043a04c93a6ba39baa284a79b20f354500f91d2f4eab1 not found: ID does not exist"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.120806 4913 scope.go:117] "RemoveContainer" containerID="0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b"
Oct 01 12:55:13 crc kubenswrapper[4913]: E1001 12:55:13.125476 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b\": container with ID starting with 0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b not found: ID does not exist" containerID="0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.125556 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b"} err="failed to get container status \"0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b\": rpc error: code = NotFound desc = could not find container \"0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b\": container with ID starting with 0c54a02909f848e0e63933350708bd446559614e113e81bb7623ee1a61335f7b not found: ID does not exist"
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.150605 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7abd6773-6041-4cb9-be41-a6438149974e" (UID: "7abd6773-6041-4cb9-be41-a6438149974e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.173340 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7abd6773-6041-4cb9-be41-a6438149974e" (UID: "7abd6773-6041-4cb9-be41-a6438149974e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.180741 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7abd6773-6041-4cb9-be41-a6438149974e" (UID: "7abd6773-6041-4cb9-be41-a6438149974e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.184982 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-config" (OuterVolumeSpecName: "config") pod "7abd6773-6041-4cb9-be41-a6438149974e" (UID: "7abd6773-6041-4cb9-be41-a6438149974e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.185358 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.185386 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-config\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.185395 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.185405 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7abd6773-6041-4cb9-be41-a6438149974e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.357073 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf4dd4865-c4gnr"]
Oct 01 12:55:13 crc kubenswrapper[4913]: I1001 12:55:13.365578 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cf4dd4865-c4gnr"]
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.033543 4913 generic.go:334] "Generic (PLEG): container finished" podID="fd92064d-f7ca-4b7c-8596-ccc759c048ad" containerID="327d699d3496bbcf6a5582b1e50a0ced444acf8fbf87978a9f96445422ea93b3" exitCode=0
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.033718 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55b9-account-create-v4tbp" event={"ID":"fd92064d-f7ca-4b7c-8596-ccc759c048ad","Type":"ContainerDied","Data":"327d699d3496bbcf6a5582b1e50a0ced444acf8fbf87978a9f96445422ea93b3"}
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.037051 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55679b7754-wczpf" event={"ID":"42099d8f-bc53-4134-9351-cbbc06da162e","Type":"ContainerStarted","Data":"0fe40609f9384f008ab9fdf68c7ef5140574885068b0015cb0d4a010cab1476e"}
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.037080 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55679b7754-wczpf" event={"ID":"42099d8f-bc53-4134-9351-cbbc06da162e","Type":"ContainerStarted","Data":"616da23422fd7788d3b49ea0c6fbaf8f5687080a8af7a1130ba2b1ac449168c7"}
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.037291 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55679b7754-wczpf"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.039091 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c58551fb-1f57-46d7-9209-dc92b0ebb305","Type":"ContainerStarted","Data":"be81fde15c619197f1ecc56545223f098833f99df9f206f8348a372c9f303ee4"}
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.039191 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.042810 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94448955c-ljhxn" event={"ID":"6d0b7295-6064-4486-a41f-ce0a9332ac61","Type":"ContainerStarted","Data":"f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d"}
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.044698 4913 generic.go:334] "Generic (PLEG): container finished" podID="c2e6e788-d705-40e2-b19e-23d915ccc7cd" containerID="35811603ce86b87b16e3fd29bfca2c735f17a20dbdeab6d793181e40bcb4ef63" exitCode=0
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.044758 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-70c3-account-create-9wk5c" event={"ID":"c2e6e788-d705-40e2-b19e-23d915ccc7cd","Type":"ContainerDied","Data":"35811603ce86b87b16e3fd29bfca2c735f17a20dbdeab6d793181e40bcb4ef63"}
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.070612 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55679b7754-wczpf" podStartSLOduration=3.07058731 podStartE2EDuration="3.07058731s" podCreationTimestamp="2025-10-01 12:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:14.064840032 +0000 UTC m=+1045.968315630" watchObservedRunningTime="2025-10-01 12:55:14.07058731 +0000 UTC m=+1045.974062898"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.107559 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94448955c-ljhxn" podStartSLOduration=4.107539745 podStartE2EDuration="4.107539745s" podCreationTimestamp="2025-10-01 12:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:14.096894322 +0000 UTC m=+1046.000369910" watchObservedRunningTime="2025-10-01 12:55:14.107539745 +0000 UTC m=+1046.011015323"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.125002 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.124981253 podStartE2EDuration="4.124981253s" podCreationTimestamp="2025-10-01 12:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:14.12196474 +0000 UTC m=+1046.025440328" watchObservedRunningTime="2025-10-01 12:55:14.124981253 +0000 UTC m=+1046.028456831"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.432811 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.665446 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.665791 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-central-agent" containerID="cri-o://f32d5070ff8f0ead1639f29d9e08814d62604cfbb04d189becaa4cf950263da1" gracePeriod=30
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.665814 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="proxy-httpd" containerID="cri-o://6bae22fb080983f57bd130e4b1d7119ad7602dcbff44013e501552b5bcdc8f70" gracePeriod=30
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.665974 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-notification-agent" containerID="cri-o://8e1504779d7618297122a847a998455f5875d6bf883d208ceecb539655235cef" gracePeriod=30
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.666038 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="sg-core" containerID="cri-o://5dc12d816508368eb5a3aadbf766c998c8622ed86236bdb278cd12573e773dcc" gracePeriod=30
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.685784 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.744370 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 01 12:55:14 crc kubenswrapper[4913]: I1001 12:55:14.824674 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7abd6773-6041-4cb9-be41-a6438149974e" path="/var/lib/kubelet/pods/7abd6773-6041-4cb9-be41-a6438149974e/volumes"
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.054965 4913 generic.go:334] "Generic (PLEG): container finished" podID="3b038340-cef3-419a-a1e2-2aa46a7f3ee6" containerID="c868668574f49e75abdcb7f6f977a2e4d3d3d10bfbc90936e59801d010e6d903" exitCode=0
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.055040 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-szknr" event={"ID":"3b038340-cef3-419a-a1e2-2aa46a7f3ee6","Type":"ContainerDied","Data":"c868668574f49e75abdcb7f6f977a2e4d3d3d10bfbc90936e59801d010e6d903"}
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.058068 4913 generic.go:334] "Generic (PLEG): container finished" podID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerID="6bae22fb080983f57bd130e4b1d7119ad7602dcbff44013e501552b5bcdc8f70" exitCode=0
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.058095 4913 generic.go:334] "Generic (PLEG): container finished" podID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerID="5dc12d816508368eb5a3aadbf766c998c8622ed86236bdb278cd12573e773dcc" exitCode=2
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.058815 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerDied","Data":"6bae22fb080983f57bd130e4b1d7119ad7602dcbff44013e501552b5bcdc8f70"}
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.058845 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerDied","Data":"5dc12d816508368eb5a3aadbf766c998c8622ed86236bdb278cd12573e773dcc"}
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.059833 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94448955c-ljhxn"
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.060258 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55679b7754-wczpf"
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.118766 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.514035 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-70c3-account-create-9wk5c"
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.520982 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-55b9-account-create-v4tbp"
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.628443 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtp6k\" (UniqueName: \"kubernetes.io/projected/fd92064d-f7ca-4b7c-8596-ccc759c048ad-kube-api-access-qtp6k\") pod \"fd92064d-f7ca-4b7c-8596-ccc759c048ad\" (UID: \"fd92064d-f7ca-4b7c-8596-ccc759c048ad\") "
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.628610 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp4h6\" (UniqueName: \"kubernetes.io/projected/c2e6e788-d705-40e2-b19e-23d915ccc7cd-kube-api-access-jp4h6\") pod \"c2e6e788-d705-40e2-b19e-23d915ccc7cd\" (UID: \"c2e6e788-d705-40e2-b19e-23d915ccc7cd\") "
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.634022 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e6e788-d705-40e2-b19e-23d915ccc7cd-kube-api-access-jp4h6" (OuterVolumeSpecName: "kube-api-access-jp4h6") pod "c2e6e788-d705-40e2-b19e-23d915ccc7cd" (UID: "c2e6e788-d705-40e2-b19e-23d915ccc7cd"). InnerVolumeSpecName "kube-api-access-jp4h6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.637473 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd92064d-f7ca-4b7c-8596-ccc759c048ad-kube-api-access-qtp6k" (OuterVolumeSpecName: "kube-api-access-qtp6k") pod "fd92064d-f7ca-4b7c-8596-ccc759c048ad" (UID: "fd92064d-f7ca-4b7c-8596-ccc759c048ad"). InnerVolumeSpecName "kube-api-access-qtp6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.730162 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtp6k\" (UniqueName: \"kubernetes.io/projected/fd92064d-f7ca-4b7c-8596-ccc759c048ad-kube-api-access-qtp6k\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:15 crc kubenswrapper[4913]: I1001 12:55:15.730206 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp4h6\" (UniqueName: \"kubernetes.io/projected/c2e6e788-d705-40e2-b19e-23d915ccc7cd-kube-api-access-jp4h6\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.068597 4913 generic.go:334] "Generic (PLEG): container finished" podID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerID="f32d5070ff8f0ead1639f29d9e08814d62604cfbb04d189becaa4cf950263da1" exitCode=0
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.068654 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerDied","Data":"f32d5070ff8f0ead1639f29d9e08814d62604cfbb04d189becaa4cf950263da1"}
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.070198 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55b9-account-create-v4tbp" event={"ID":"fd92064d-f7ca-4b7c-8596-ccc759c048ad","Type":"ContainerDied","Data":"832cbcf618ba87b4d3d1a9b203f3ef96e6b9d99ee1a36523e28ad73367775007"}
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.070223 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832cbcf618ba87b4d3d1a9b203f3ef96e6b9d99ee1a36523e28ad73367775007"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.070281 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-55b9-account-create-v4tbp"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.076281 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-70c3-account-create-9wk5c"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.085495 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-70c3-account-create-9wk5c" event={"ID":"c2e6e788-d705-40e2-b19e-23d915ccc7cd","Type":"ContainerDied","Data":"de615a74e5c856e9fda39257f167b36bd8a8ca4b018526849e61ad09a6aa29e4"}
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.085579 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de615a74e5c856e9fda39257f167b36bd8a8ca4b018526849e61ad09a6aa29e4"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.085781 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="cinder-scheduler" containerID="cri-o://0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132" gracePeriod=30
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.085793 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="probe" containerID="cri-o://2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46" gracePeriod=30
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.340309 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-778944cb96-tm27q"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.383784 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-778944cb96-tm27q"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.464343 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-szknr"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.546030 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dfvx\" (UniqueName: \"kubernetes.io/projected/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-kube-api-access-8dfvx\") pod \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") "
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.546067 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-combined-ca-bundle\") pod \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") "
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.546156 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-config\") pod \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\" (UID: \"3b038340-cef3-419a-a1e2-2aa46a7f3ee6\") "
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.553732 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-kube-api-access-8dfvx" (OuterVolumeSpecName: "kube-api-access-8dfvx") pod "3b038340-cef3-419a-a1e2-2aa46a7f3ee6" (UID: "3b038340-cef3-419a-a1e2-2aa46a7f3ee6"). InnerVolumeSpecName "kube-api-access-8dfvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.617703 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t5ghk"]
Oct 01 12:55:16 crc kubenswrapper[4913]: E1001 12:55:16.618056 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abd6773-6041-4cb9-be41-a6438149974e" containerName="init"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618072 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abd6773-6041-4cb9-be41-a6438149974e" containerName="init"
Oct 01 12:55:16 crc kubenswrapper[4913]: E1001 12:55:16.618088 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e6e788-d705-40e2-b19e-23d915ccc7cd" containerName="mariadb-account-create"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618094 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e6e788-d705-40e2-b19e-23d915ccc7cd" containerName="mariadb-account-create"
Oct 01 12:55:16 crc kubenswrapper[4913]: E1001 12:55:16.618105 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b038340-cef3-419a-a1e2-2aa46a7f3ee6" containerName="neutron-db-sync"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618111 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b038340-cef3-419a-a1e2-2aa46a7f3ee6" containerName="neutron-db-sync"
Oct 01 12:55:16 crc kubenswrapper[4913]: E1001 12:55:16.618126 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd92064d-f7ca-4b7c-8596-ccc759c048ad" containerName="mariadb-account-create"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618134 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd92064d-f7ca-4b7c-8596-ccc759c048ad" containerName="mariadb-account-create"
Oct 01 12:55:16 crc kubenswrapper[4913]: E1001 12:55:16.618148 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abd6773-6041-4cb9-be41-a6438149974e" containerName="dnsmasq-dns"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618153 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abd6773-6041-4cb9-be41-a6438149974e" containerName="dnsmasq-dns"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618321 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abd6773-6041-4cb9-be41-a6438149974e" containerName="dnsmasq-dns"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618336 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e6e788-d705-40e2-b19e-23d915ccc7cd" containerName="mariadb-account-create"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618345 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b038340-cef3-419a-a1e2-2aa46a7f3ee6" containerName="neutron-db-sync"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618356 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd92064d-f7ca-4b7c-8596-ccc759c048ad" containerName="mariadb-account-create"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.618924 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.624098 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.624201 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jz8n5"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.624256 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.642410 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t5ghk"]
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.648234 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dfvx\" (UniqueName: \"kubernetes.io/projected/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-kube-api-access-8dfvx\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.652977 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-config" (OuterVolumeSpecName: "config") pod "3b038340-cef3-419a-a1e2-2aa46a7f3ee6" (UID: "3b038340-cef3-419a-a1e2-2aa46a7f3ee6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.655568 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b038340-cef3-419a-a1e2-2aa46a7f3ee6" (UID: "3b038340-cef3-419a-a1e2-2aa46a7f3ee6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.749789 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.749916 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-config-data\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.749957 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gqv6\" (UniqueName: \"kubernetes.io/projected/ba0e4f52-1a20-4443-98c4-03620eec847f-kube-api-access-2gqv6\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.749989 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-scripts\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.750083 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.750100 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b038340-cef3-419a-a1e2-2aa46a7f3ee6-config\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.851631 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-scripts\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.852333 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.852459 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-config-data\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.852495 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gqv6\" (UniqueName: \"kubernetes.io/projected/ba0e4f52-1a20-4443-98c4-03620eec847f-kube-api-access-2gqv6\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.857837 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-scripts\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.857876 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.860072 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-config-data\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:16 crc kubenswrapper[4913]: I1001 12:55:16.870209 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gqv6\" (UniqueName: \"kubernetes.io/projected/ba0e4f52-1a20-4443-98c4-03620eec847f-kube-api-access-2gqv6\") pod \"nova-cell0-conductor-db-sync-t5ghk\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") " pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.020418 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.085557 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-szknr" event={"ID":"3b038340-cef3-419a-a1e2-2aa46a7f3ee6","Type":"ContainerDied","Data":"7e2b3ddf726fa717a7f0ee62fd4b72b0915ad2ea932afbdac0243d4fa82c1b95"}
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.085599 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e2b3ddf726fa717a7f0ee62fd4b72b0915ad2ea932afbdac0243d4fa82c1b95"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.085614 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-szknr"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.106191 4913 generic.go:334] "Generic (PLEG): container finished" podID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerID="8e1504779d7618297122a847a998455f5875d6bf883d208ceecb539655235cef" exitCode=0
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.106328 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerDied","Data":"8e1504779d7618297122a847a998455f5875d6bf883d208ceecb539655235cef"}
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.108055 4913 generic.go:334] "Generic (PLEG): container finished" podID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerID="2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46" exitCode=0
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.108120 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ee5a8249-034c-4e5a-b562-f98954678ca0","Type":"ContainerDied","Data":"2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46"}
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.321068 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94448955c-ljhxn"]
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.327116 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94448955c-ljhxn" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerName="dnsmasq-dns" containerID="cri-o://f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d" gracePeriod=10
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.375619 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659bcf5cf5-8vzps"]
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.382512 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.406945 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659bcf5cf5-8vzps"]
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.465133 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-nb\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.465190 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4zm7\" (UniqueName: \"kubernetes.io/projected/19e92ca3-0ea8-408d-902d-d0c6e283129f-kube-api-access-g4zm7\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.465277 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-dns-svc\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.465315 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-sb\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.465351 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-config\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.472198 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5759df9d4d-h2pz9"]
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.473628 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.476882 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.477042 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k6bs9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.477184 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.477341 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.481806 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5759df9d4d-h2pz9"]
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.526197 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t5ghk"]
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.568548 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-ovndb-tls-certs\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.568660 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-dns-svc\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.568741 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-sb\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.569722 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-dns-svc\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.569796 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-sb\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.569831 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-config\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.569918 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-httpd-config\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.569961 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-nb\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.570019 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4zm7\" (UniqueName: \"kubernetes.io/projected/19e92ca3-0ea8-408d-902d-d0c6e283129f-kube-api-access-g4zm7\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.570067 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-combined-ca-bundle\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.570089 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-config\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.570115 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85wk2\" (UniqueName: \"kubernetes.io/projected/839626e4-5aad-4abf-b758-80755e37b5b3-kube-api-access-85wk2\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.570135 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-config\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.572579 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-nb\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.589587 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4zm7\" (UniqueName: \"kubernetes.io/projected/19e92ca3-0ea8-408d-902d-d0c6e283129f-kube-api-access-g4zm7\") pod \"dnsmasq-dns-659bcf5cf5-8vzps\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.671490 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-httpd-config\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.671827 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-combined-ca-bundle\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.672194 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-config\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.672218 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85wk2\" (UniqueName: \"kubernetes.io/projected/839626e4-5aad-4abf-b758-80755e37b5b3-kube-api-access-85wk2\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.672244 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-ovndb-tls-certs\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.675689 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-combined-ca-bundle\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.677373 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-ovndb-tls-certs\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.677636 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-config\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.680821 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-httpd-config\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.692010 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85wk2\" (UniqueName: \"kubernetes.io/projected/839626e4-5aad-4abf-b758-80755e37b5b3-kube-api-access-85wk2\") pod \"neutron-5759df9d4d-h2pz9\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.743150 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps"
Oct 01 12:55:17 crc kubenswrapper[4913]: I1001 12:55:17.806590 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5759df9d4d-h2pz9"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.061589 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94448955c-ljhxn"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.072090 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.160765 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ed98c9-9585-44d2-a913-ebdcfa04ac53","Type":"ContainerDied","Data":"05d7293ca7da7b6b95bc2a89a9735bd79134c08b7644776ae078775c97025fc5"}
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.160815 4913 scope.go:117] "RemoveContainer" containerID="6bae22fb080983f57bd130e4b1d7119ad7602dcbff44013e501552b5bcdc8f70"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.160951 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.171430 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t5ghk" event={"ID":"ba0e4f52-1a20-4443-98c4-03620eec847f","Type":"ContainerStarted","Data":"2ea498f1b799cf45b13eab183b39e90f74a8fe8940a52972ea96a5af52a42788"}
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.173627 4913 generic.go:334] "Generic (PLEG): container finished" podID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerID="f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d" exitCode=0
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.173659 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94448955c-ljhxn" event={"ID":"6d0b7295-6064-4486-a41f-ce0a9332ac61","Type":"ContainerDied","Data":"f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d"}
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.173675 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94448955c-ljhxn" event={"ID":"6d0b7295-6064-4486-a41f-ce0a9332ac61","Type":"ContainerDied","Data":"d246ecb4a6011f640fa33cd84bad38929461117825b5f9e935565b144fd7d596"}
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.173745 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94448955c-ljhxn"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200392 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72g8x\" (UniqueName: \"kubernetes.io/projected/51ed98c9-9585-44d2-a913-ebdcfa04ac53-kube-api-access-72g8x\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200444 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-log-httpd\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200492 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-nb\") pod \"6d0b7295-6064-4486-a41f-ce0a9332ac61\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200514 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tw6d\" (UniqueName: \"kubernetes.io/projected/6d0b7295-6064-4486-a41f-ce0a9332ac61-kube-api-access-5tw6d\") pod \"6d0b7295-6064-4486-a41f-ce0a9332ac61\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200540 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-combined-ca-bundle\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200573 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-sb\") pod \"6d0b7295-6064-4486-a41f-ce0a9332ac61\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200600 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-sg-core-conf-yaml\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200623 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-config-data\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200651 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-config\") pod \"6d0b7295-6064-4486-a41f-ce0a9332ac61\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200723 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-scripts\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200762 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-run-httpd\") pod \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\" (UID: \"51ed98c9-9585-44d2-a913-ebdcfa04ac53\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.200784 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-dns-svc\") pod \"6d0b7295-6064-4486-a41f-ce0a9332ac61\" (UID: \"6d0b7295-6064-4486-a41f-ce0a9332ac61\") "
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.216810 4913 scope.go:117] "RemoveContainer" containerID="5dc12d816508368eb5a3aadbf766c998c8622ed86236bdb278cd12573e773dcc"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.224691 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.225063 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.250576 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-scripts" (OuterVolumeSpecName: "scripts") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.263571 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0b7295-6064-4486-a41f-ce0a9332ac61-kube-api-access-5tw6d" (OuterVolumeSpecName: "kube-api-access-5tw6d") pod "6d0b7295-6064-4486-a41f-ce0a9332ac61" (UID: "6d0b7295-6064-4486-a41f-ce0a9332ac61"). InnerVolumeSpecName "kube-api-access-5tw6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.283460 4913 scope.go:117] "RemoveContainer" containerID="8e1504779d7618297122a847a998455f5875d6bf883d208ceecb539655235cef"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.283714 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ed98c9-9585-44d2-a913-ebdcfa04ac53-kube-api-access-72g8x" (OuterVolumeSpecName: "kube-api-access-72g8x") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "kube-api-access-72g8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.303401 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.303426 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.303437 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72g8x\" (UniqueName: \"kubernetes.io/projected/51ed98c9-9585-44d2-a913-ebdcfa04ac53-kube-api-access-72g8x\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.303445 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ed98c9-9585-44d2-a913-ebdcfa04ac53-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.303454 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tw6d\" (UniqueName: \"kubernetes.io/projected/6d0b7295-6064-4486-a41f-ce0a9332ac61-kube-api-access-5tw6d\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.353042 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d0b7295-6064-4486-a41f-ce0a9332ac61" (UID: "6d0b7295-6064-4486-a41f-ce0a9332ac61"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.362699 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659bcf5cf5-8vzps"]
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.385393 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d0b7295-6064-4486-a41f-ce0a9332ac61" (UID: "6d0b7295-6064-4486-a41f-ce0a9332ac61"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.387872 4913 scope.go:117] "RemoveContainer" containerID="f32d5070ff8f0ead1639f29d9e08814d62604cfbb04d189becaa4cf950263da1"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.390821 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d0b7295-6064-4486-a41f-ce0a9332ac61" (UID: "6d0b7295-6064-4486-a41f-ce0a9332ac61"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.404609 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.404631 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.404641 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.406452 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.433476 4913 scope.go:117] "RemoveContainer" containerID="f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.446400 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-config" (OuterVolumeSpecName: "config") pod "6d0b7295-6064-4486-a41f-ce0a9332ac61" (UID: "6d0b7295-6064-4486-a41f-ce0a9332ac61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.453370 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.459113 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5759df9d4d-h2pz9"]
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.479154 4913 scope.go:117] "RemoveContainer" containerID="f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a"
Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.495386 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-config-data" (OuterVolumeSpecName: "config-data") pod "51ed98c9-9585-44d2-a913-ebdcfa04ac53" (UID: "51ed98c9-9585-44d2-a913-ebdcfa04ac53"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:18 crc kubenswrapper[4913]: W1001 12:55:18.499667 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod839626e4_5aad_4abf_b758_80755e37b5b3.slice/crio-0e81c03d8deaa45230204b9bedbfca94079e33477b480922eae275504d6a09b1 WatchSource:0}: Error finding container 0e81c03d8deaa45230204b9bedbfca94079e33477b480922eae275504d6a09b1: Status 404 returned error can't find the container with id 0e81c03d8deaa45230204b9bedbfca94079e33477b480922eae275504d6a09b1 Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.506787 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.506814 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.506823 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d0b7295-6064-4486-a41f-ce0a9332ac61-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.506831 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ed98c9-9585-44d2-a913-ebdcfa04ac53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.533597 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94448955c-ljhxn"] Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.547492 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94448955c-ljhxn"] Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.565055 4913 scope.go:117] "RemoveContainer" containerID="f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d" Oct 01 12:55:18 crc kubenswrapper[4913]: E1001 12:55:18.565420 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d\": container with ID starting with f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d not found: ID does not exist" containerID="f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.565449 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d"} err="failed to get container status \"f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d\": rpc error: code = NotFound desc = could not find container \"f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d\": container with ID starting with f8d2aeaf8182357cd05e3c774f510beccd9527b9082ef0ef4ea51e2b2e72d01d not found: ID does not exist" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.565468 4913 scope.go:117] "RemoveContainer" containerID="f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a" Oct 01 12:55:18 crc kubenswrapper[4913]: E1001 12:55:18.565843 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a\": container with ID starting with f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a not found: ID does not exist" containerID="f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.565869 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a"} err="failed to get container status \"f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a\": rpc error: code = NotFound desc = could not find container \"f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a\": container with ID starting with f75704b0441669bb251dba0843bef4922a532220bdc9dee01de17ff67340a70a not found: ID does not exist" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.866609 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" path="/var/lib/kubelet/pods/6d0b7295-6064-4486-a41f-ce0a9332ac61/volumes" Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.946751 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:18 crc kubenswrapper[4913]: I1001 12:55:18.952821 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017395 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.017769 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerName="dnsmasq-dns" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017792 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerName="dnsmasq-dns" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.017810 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerName="init" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017816 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerName="init" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.017830 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="proxy-httpd" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017837 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="proxy-httpd" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.017851 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-notification-agent" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017861 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-notification-agent" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.017877 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-central-agent" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017884 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" 
containerName="ceilometer-central-agent" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.017894 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="sg-core" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.017901 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="sg-core" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.018063 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0b7295-6064-4486-a41f-ce0a9332ac61" containerName="dnsmasq-dns" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.018084 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="sg-core" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.018091 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-central-agent" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.018101 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="proxy-httpd" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.018113 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="ceilometer-notification-agent" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.019604 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.025568 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.025757 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.067828 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119441 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119487 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-run-httpd\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119510 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119585 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-log-httpd\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " 
pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119603 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwst\" (UniqueName: \"kubernetes.io/projected/72edfef8-0b60-42d5-a649-398d42c9daa7-kube-api-access-hcwst\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119627 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-scripts\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.119653 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-config-data\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.177994 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.209076 4913 generic.go:334] "Generic (PLEG): container finished" podID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerID="0add67691b0d7dfbf83722bb94081213fbd4c826adab54dff271ed8ce956d1f6" exitCode=0 Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.209143 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" event={"ID":"19e92ca3-0ea8-408d-902d-d0c6e283129f","Type":"ContainerDied","Data":"0add67691b0d7dfbf83722bb94081213fbd4c826adab54dff271ed8ce956d1f6"} Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.209170 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" event={"ID":"19e92ca3-0ea8-408d-902d-d0c6e283129f","Type":"ContainerStarted","Data":"79176e0a1437531fc84f94fd9e546ad06693fe8fa88dfe2bd6b33e8a8d608187"} Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.225540 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data\") pod \"ee5a8249-034c-4e5a-b562-f98954678ca0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.225621 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg7sv\" (UniqueName: \"kubernetes.io/projected/ee5a8249-034c-4e5a-b562-f98954678ca0-kube-api-access-mg7sv\") pod \"ee5a8249-034c-4e5a-b562-f98954678ca0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.225737 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-scripts\") pod \"ee5a8249-034c-4e5a-b562-f98954678ca0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.225759 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee5a8249-034c-4e5a-b562-f98954678ca0-etc-machine-id\") pod 
\"ee5a8249-034c-4e5a-b562-f98954678ca0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.225812 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data-custom\") pod \"ee5a8249-034c-4e5a-b562-f98954678ca0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.225833 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-combined-ca-bundle\") pod \"ee5a8249-034c-4e5a-b562-f98954678ca0\" (UID: \"ee5a8249-034c-4e5a-b562-f98954678ca0\") " Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226064 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-log-httpd\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226094 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwst\" (UniqueName: \"kubernetes.io/projected/72edfef8-0b60-42d5-a649-398d42c9daa7-kube-api-access-hcwst\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226121 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-scripts\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226144 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-config-data\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226196 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226215 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-run-httpd\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.226233 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.228828 4913 generic.go:334] "Generic (PLEG): container finished" podID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerID="0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132" exitCode=0 Oct 01 
12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.228915 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ee5a8249-034c-4e5a-b562-f98954678ca0","Type":"ContainerDied","Data":"0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132"} Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.228945 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ee5a8249-034c-4e5a-b562-f98954678ca0","Type":"ContainerDied","Data":"8908d22b986ef4aca28e97d222d3ce6ed36533f0fa93ebf240ff9714e9e9feb3"} Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.228962 4913 scope.go:117] "RemoveContainer" containerID="2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.229097 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.229378 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee5a8249-034c-4e5a-b562-f98954678ca0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ee5a8249-034c-4e5a-b562-f98954678ca0" (UID: "ee5a8249-034c-4e5a-b562-f98954678ca0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.229996 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-log-httpd\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.234795 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.235449 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-run-httpd\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.235729 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-scripts\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.244912 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5a8249-034c-4e5a-b562-f98954678ca0-kube-api-access-mg7sv" (OuterVolumeSpecName: "kube-api-access-mg7sv") pod "ee5a8249-034c-4e5a-b562-f98954678ca0" (UID: "ee5a8249-034c-4e5a-b562-f98954678ca0"). InnerVolumeSpecName "kube-api-access-mg7sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.246084 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5759df9d4d-h2pz9" event={"ID":"839626e4-5aad-4abf-b758-80755e37b5b3","Type":"ContainerStarted","Data":"dad297f85cc097635e915f216c866d1131fbd782e96b3725514e98b52ba268af"} Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.246152 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5759df9d4d-h2pz9" event={"ID":"839626e4-5aad-4abf-b758-80755e37b5b3","Type":"ContainerStarted","Data":"0e81c03d8deaa45230204b9bedbfca94079e33477b480922eae275504d6a09b1"} Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.253777 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-config-data\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.257847 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.259701 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-scripts" (OuterVolumeSpecName: "scripts") pod "ee5a8249-034c-4e5a-b562-f98954678ca0" (UID: "ee5a8249-034c-4e5a-b562-f98954678ca0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.269862 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee5a8249-034c-4e5a-b562-f98954678ca0" (UID: "ee5a8249-034c-4e5a-b562-f98954678ca0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.274005 4913 scope.go:117] "RemoveContainer" containerID="0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.275848 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwst\" (UniqueName: \"kubernetes.io/projected/72edfef8-0b60-42d5-a649-398d42c9daa7-kube-api-access-hcwst\") pod \"ceilometer-0\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.301560 4913 scope.go:117] "RemoveContainer" containerID="2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.301907 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46\": container with ID starting with 2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46 not found: ID does not exist" containerID="2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.301942 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46"} err="failed to get container status \"2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46\": rpc error: code = NotFound desc = could not find container \"2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46\": container with ID starting with 2111baf66902bee324e8b60fa58e5afac34f2186318babe9a339e7b5505f7e46 not found: ID does not exist" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.301964 4913 scope.go:117] "RemoveContainer" containerID="0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.302179 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132\": container with ID starting with 0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132 not found: ID does not exist" containerID="0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.302208 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132"} err="failed to get container status \"0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132\": rpc error: code = NotFound desc = could not find container \"0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132\": container with ID starting with 0e2f75a442ef6de425ea9860fbc1e62337133deef64db4560b0a155b3209d132 not found: ID does not exist" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.328446 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg7sv\" (UniqueName: \"kubernetes.io/projected/ee5a8249-034c-4e5a-b562-f98954678ca0-kube-api-access-mg7sv\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.328472 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.328481 4913 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee5a8249-034c-4e5a-b562-f98954678ca0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.328490 4913 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.421310 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.422098 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.451606 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee5a8249-034c-4e5a-b562-f98954678ca0" (UID: "ee5a8249-034c-4e5a-b562-f98954678ca0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.530581 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.533421 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data" (OuterVolumeSpecName: "config-data") pod "ee5a8249-034c-4e5a-b562-f98954678ca0" (UID: "ee5a8249-034c-4e5a-b562-f98954678ca0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.633500 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a8249-034c-4e5a-b562-f98954678ca0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.896516 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.904463 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.921694 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.922196 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="probe" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.922214 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="probe" Oct 01 12:55:19 crc kubenswrapper[4913]: E1001 12:55:19.922233 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="cinder-scheduler" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.922240 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="cinder-scheduler" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.922446 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="probe" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.922464 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" containerName="cinder-scheduler" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.924161 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.926305 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.935827 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:19 crc kubenswrapper[4913]: I1001 12:55:19.995487 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:20 crc kubenswrapper[4913]: W1001 12:55:20.022031 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72edfef8_0b60_42d5_a649_398d42c9daa7.slice/crio-6c3bd8c3cafefe468b49a78a9e665af2169727471094c793563d741459baea0c WatchSource:0}: Error finding container 6c3bd8c3cafefe468b49a78a9e665af2169727471094c793563d741459baea0c: Status 404 returned error can't find the container with id 6c3bd8c3cafefe468b49a78a9e665af2169727471094c793563d741459baea0c Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.050440 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q5tc\" (UniqueName: \"kubernetes.io/projected/3708ac45-f021-44ed-8c85-e34c2ed73241-kube-api-access-6q5tc\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.050516 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-config-data\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.050664 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3708ac45-f021-44ed-8c85-e34c2ed73241-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.050839 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.050974 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.051006 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-scripts\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.152896 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.152950 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-scripts\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.152987 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q5tc\" (UniqueName: \"kubernetes.io/projected/3708ac45-f021-44ed-8c85-e34c2ed73241-kube-api-access-6q5tc\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.153039 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-config-data\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.153078 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3708ac45-f021-44ed-8c85-e34c2ed73241-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.153116 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.153902 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3708ac45-f021-44ed-8c85-e34c2ed73241-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.159172 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.159705 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-config-data\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.159793 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-scripts\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.161771 
4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3708ac45-f021-44ed-8c85-e34c2ed73241-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.172714 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q5tc\" (UniqueName: \"kubernetes.io/projected/3708ac45-f021-44ed-8c85-e34c2ed73241-kube-api-access-6q5tc\") pod \"cinder-scheduler-0\" (UID: \"3708ac45-f021-44ed-8c85-e34c2ed73241\") " pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.239590 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.270492 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerStarted","Data":"6c3bd8c3cafefe468b49a78a9e665af2169727471094c793563d741459baea0c"} Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.280470 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5759df9d4d-h2pz9" event={"ID":"839626e4-5aad-4abf-b758-80755e37b5b3","Type":"ContainerStarted","Data":"74797fe20c5f882f1deddc601251f5765b0a34705cd4616ab5cb24147e03b90a"} Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.280609 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5759df9d4d-h2pz9" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.295522 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" event={"ID":"19e92ca3-0ea8-408d-902d-d0c6e283129f","Type":"ContainerStarted","Data":"8b186600a4b73e6dae5d57fa49e1fec309885f18083f6cb414480d396b9efba8"} Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.295611 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.309567 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5759df9d4d-h2pz9" podStartSLOduration=3.309548987 podStartE2EDuration="3.309548987s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:20.304956132 +0000 UTC m=+1052.208431730" watchObservedRunningTime="2025-10-01 12:55:20.309548987 +0000 UTC m=+1052.213024565" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.347490 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" podStartSLOduration=3.347460358 podStartE2EDuration="3.347460358s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:20.325632109 +0000 UTC m=+1052.229107687" watchObservedRunningTime="2025-10-01 12:55:20.347460358 +0000 UTC m=+1052.250935936" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.611337 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6854dd75d7-6cgpn"] Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.613332 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.615684 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.618786 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.632547 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6854dd75d7-6cgpn"] Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676291 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-config\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676350 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-httpd-config\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676398 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7x5\" (UniqueName: \"kubernetes.io/projected/6278eaca-e01e-4eb1-9c7f-e12fc399606a-kube-api-access-hz7x5\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676458 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-ovndb-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676476 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-public-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676496 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-internal-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.676543 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-combined-ca-bundle\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778094 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-config\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778375 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-httpd-config\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778414 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7x5\" (UniqueName: \"kubernetes.io/projected/6278eaca-e01e-4eb1-9c7f-e12fc399606a-kube-api-access-hz7x5\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778487 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-ovndb-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778547 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-public-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778578 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-internal-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.778637 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-combined-ca-bundle\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.784698 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-public-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.786419 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-internal-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.787231 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-ovndb-tls-certs\") pod \"neutron-6854dd75d7-6cgpn\" (UID: 
\"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.788939 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-httpd-config\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.789682 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-config\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.798325 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6278eaca-e01e-4eb1-9c7f-e12fc399606a-combined-ca-bundle\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.800683 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7x5\" (UniqueName: \"kubernetes.io/projected/6278eaca-e01e-4eb1-9c7f-e12fc399606a-kube-api-access-hz7x5\") pod \"neutron-6854dd75d7-6cgpn\" (UID: \"6278eaca-e01e-4eb1-9c7f-e12fc399606a\") " pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.823301 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" path="/var/lib/kubelet/pods/51ed98c9-9585-44d2-a913-ebdcfa04ac53/volumes" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.824142 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5a8249-034c-4e5a-b562-f98954678ca0" path="/var/lib/kubelet/pods/ee5a8249-034c-4e5a-b562-f98954678ca0/volumes" Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.824758 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:55:20 crc kubenswrapper[4913]: W1001 12:55:20.836668 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3708ac45_f021_44ed_8c85_e34c2ed73241.slice/crio-7a585055a0b580ad8a5b4a8d36892ca55396666443c215ab84b0f7712f782b3a WatchSource:0}: Error finding container 7a585055a0b580ad8a5b4a8d36892ca55396666443c215ab84b0f7712f782b3a: Status 404 returned error can't find the container with id 7a585055a0b580ad8a5b4a8d36892ca55396666443c215ab84b0f7712f782b3a Oct 01 12:55:20 crc kubenswrapper[4913]: I1001 12:55:20.947594 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.115367 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c0ec-account-create-tz4m8"] Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.116894 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.119501 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.123469 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c0ec-account-create-tz4m8"] Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.193162 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7sn\" (UniqueName: \"kubernetes.io/projected/639c9d26-ffcf-4cd4-989c-de3f777ec5ea-kube-api-access-hh7sn\") pod \"nova-api-c0ec-account-create-tz4m8\" (UID: \"639c9d26-ffcf-4cd4-989c-de3f777ec5ea\") " pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.294903 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7sn\" (UniqueName: \"kubernetes.io/projected/639c9d26-ffcf-4cd4-989c-de3f777ec5ea-kube-api-access-hh7sn\") pod \"nova-api-c0ec-account-create-tz4m8\" (UID: \"639c9d26-ffcf-4cd4-989c-de3f777ec5ea\") " pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.322408 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7sn\" (UniqueName: \"kubernetes.io/projected/639c9d26-ffcf-4cd4-989c-de3f777ec5ea-kube-api-access-hh7sn\") pod \"nova-api-c0ec-account-create-tz4m8\" (UID: \"639c9d26-ffcf-4cd4-989c-de3f777ec5ea\") " pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.325163 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3708ac45-f021-44ed-8c85-e34c2ed73241","Type":"ContainerStarted","Data":"7a585055a0b580ad8a5b4a8d36892ca55396666443c215ab84b0f7712f782b3a"} Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.337963 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerStarted","Data":"c1b5178ed2df16c415bf4b28626249152a41108b30a58ea928d1c662fff05ec8"} Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.441725 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:21 crc kubenswrapper[4913]: I1001 12:55:21.568871 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6854dd75d7-6cgpn"] Oct 01 12:55:22 crc kubenswrapper[4913]: I1001 12:55:22.068843 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c0ec-account-create-tz4m8"] Oct 01 12:55:22 crc kubenswrapper[4913]: W1001 12:55:22.125307 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod639c9d26_ffcf_4cd4_989c_de3f777ec5ea.slice/crio-d103305a82f746f3ac5e2336b122550710abe3ff4b0419d0b8a5b5b329755cc1 WatchSource:0}: Error finding container d103305a82f746f3ac5e2336b122550710abe3ff4b0419d0b8a5b5b329755cc1: Status 404 returned error can't find the container with id d103305a82f746f3ac5e2336b122550710abe3ff4b0419d0b8a5b5b329755cc1 Oct 01 12:55:22 crc kubenswrapper[4913]: I1001 12:55:22.361965 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3708ac45-f021-44ed-8c85-e34c2ed73241","Type":"ContainerStarted","Data":"724535cbbf25179f51d757a6ef85e2e5198d5fc3e4618c9344fb76182b5c2e01"} Oct 01 12:55:22 crc kubenswrapper[4913]: I1001 12:55:22.368656 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerStarted","Data":"5869f8ccac15db46a659358570e01bbab7e658e85ad6ede1ac0980d5d1618fc0"} Oct 01 12:55:22 crc kubenswrapper[4913]: I1001 12:55:22.369913 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c0ec-account-create-tz4m8" event={"ID":"639c9d26-ffcf-4cd4-989c-de3f777ec5ea","Type":"ContainerStarted","Data":"d103305a82f746f3ac5e2336b122550710abe3ff4b0419d0b8a5b5b329755cc1"} Oct 01 12:55:22 crc kubenswrapper[4913]: I1001 12:55:22.371360 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854dd75d7-6cgpn" event={"ID":"6278eaca-e01e-4eb1-9c7f-e12fc399606a","Type":"ContainerStarted","Data":"a828cad5e41cabe0ac01340d4653519ea30a9b31612df4e7c7a7d1237c7def99"} Oct 01 12:55:22 crc kubenswrapper[4913]: I1001 12:55:22.371393 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854dd75d7-6cgpn" event={"ID":"6278eaca-e01e-4eb1-9c7f-e12fc399606a","Type":"ContainerStarted","Data":"89f6feb9c0150ead363d78b75a6d244dd2d4417bdd2cfa08f3857093178f168e"} Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.355515 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.457228 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3708ac45-f021-44ed-8c85-e34c2ed73241","Type":"ContainerStarted","Data":"e030052d934e49b57096b9eb1a4e8ccaaf3bca73f19e8bb412ccf940bad95180"} Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.467189 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerStarted","Data":"dbe02e600b5c4897f2ed2e5e1dcbfee99a17412f32d469611168cef5dc81a29a"} Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.473510 4913 generic.go:334] "Generic (PLEG): container finished" podID="639c9d26-ffcf-4cd4-989c-de3f777ec5ea" containerID="e35a032a728caed1c5f4271a94daf7c620a76e613f314cf9036f7111f731420f" exitCode=0 Oct 01 12:55:23 crc 
kubenswrapper[4913]: I1001 12:55:23.474341 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c0ec-account-create-tz4m8" event={"ID":"639c9d26-ffcf-4cd4-989c-de3f777ec5ea","Type":"ContainerDied","Data":"e35a032a728caed1c5f4271a94daf7c620a76e613f314cf9036f7111f731420f"} Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.497280 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6854dd75d7-6cgpn" event={"ID":"6278eaca-e01e-4eb1-9c7f-e12fc399606a","Type":"ContainerStarted","Data":"eefa4a96ae6ebffd6d4493738a2df2a8cab0b143b57a323b3c96f7bd21243425"} Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.497626 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6854dd75d7-6cgpn" Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.497692 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.497671661 podStartE2EDuration="4.497671661s" podCreationTimestamp="2025-10-01 12:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:23.481746463 +0000 UTC m=+1055.385222051" watchObservedRunningTime="2025-10-01 12:55:23.497671661 +0000 UTC m=+1055.401147239" Oct 01 12:55:23 crc kubenswrapper[4913]: I1001 12:55:23.535948 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6854dd75d7-6cgpn" podStartSLOduration=3.53593014 podStartE2EDuration="3.53593014s" podCreationTimestamp="2025-10-01 12:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:23.521579226 +0000 UTC m=+1055.425054824" watchObservedRunningTime="2025-10-01 12:55:23.53593014 +0000 UTC m=+1055.439405718" Oct 01 12:55:24 crc kubenswrapper[4913]: I1001 12:55:24.402538 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:24 crc kubenswrapper[4913]: I1001 12:55:24.803831 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55679b7754-wczpf" Oct 01 12:55:24 crc kubenswrapper[4913]: I1001 12:55:24.887363 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-778944cb96-tm27q"] Oct 01 12:55:24 crc kubenswrapper[4913]: I1001 12:55:24.887568 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-778944cb96-tm27q" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api-log" containerID="cri-o://9c0e67c0b49a361639069473d3506f3057f6038adb8989064f9c6907092d999d" gracePeriod=30 Oct 01 12:55:24 crc kubenswrapper[4913]: I1001 12:55:24.887933 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-778944cb96-tm27q" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api" containerID="cri-o://dad074d9b09f0dcf53d2ce6062cc6b470772e246bdb6bf5e4e2e77f88c5e4ff6" gracePeriod=30 Oct 01 12:55:24 crc kubenswrapper[4913]: I1001 12:55:24.961536 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.086456 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh7sn\" (UniqueName: \"kubernetes.io/projected/639c9d26-ffcf-4cd4-989c-de3f777ec5ea-kube-api-access-hh7sn\") pod \"639c9d26-ffcf-4cd4-989c-de3f777ec5ea\" (UID: \"639c9d26-ffcf-4cd4-989c-de3f777ec5ea\") " Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.092422 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639c9d26-ffcf-4cd4-989c-de3f777ec5ea-kube-api-access-hh7sn" (OuterVolumeSpecName: "kube-api-access-hh7sn") pod "639c9d26-ffcf-4cd4-989c-de3f777ec5ea" (UID: "639c9d26-ffcf-4cd4-989c-de3f777ec5ea"). InnerVolumeSpecName "kube-api-access-hh7sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.188121 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh7sn\" (UniqueName: \"kubernetes.io/projected/639c9d26-ffcf-4cd4-989c-de3f777ec5ea-kube-api-access-hh7sn\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.240035 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.532444 4913 generic.go:334] "Generic (PLEG): container finished" podID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerID="9c0e67c0b49a361639069473d3506f3057f6038adb8989064f9c6907092d999d" exitCode=143 Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.532507 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-778944cb96-tm27q" event={"ID":"fe92d074-cafa-4a38-a8a3-f49068a3366d","Type":"ContainerDied","Data":"9c0e67c0b49a361639069473d3506f3057f6038adb8989064f9c6907092d999d"} Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.534555 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c0ec-account-create-tz4m8" event={"ID":"639c9d26-ffcf-4cd4-989c-de3f777ec5ea","Type":"ContainerDied","Data":"d103305a82f746f3ac5e2336b122550710abe3ff4b0419d0b8a5b5b329755cc1"} Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.534580 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d103305a82f746f3ac5e2336b122550710abe3ff4b0419d0b8a5b5b329755cc1" Oct 01 12:55:25 crc kubenswrapper[4913]: I1001 12:55:25.534622 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c0ec-account-create-tz4m8" Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.545936 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerStarted","Data":"c2e25c4896c7a15e6821d2984433223a86e2bc22f14c1c5e685387ba332e0c52"} Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.546123 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-central-agent" containerID="cri-o://c1b5178ed2df16c415bf4b28626249152a41108b30a58ea928d1c662fff05ec8" gracePeriod=30 Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.547092 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="proxy-httpd" containerID="cri-o://c2e25c4896c7a15e6821d2984433223a86e2bc22f14c1c5e685387ba332e0c52" gracePeriod=30 Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.547150 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-notification-agent" containerID="cri-o://5869f8ccac15db46a659358570e01bbab7e658e85ad6ede1ac0980d5d1618fc0" gracePeriod=30 Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.547176 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="sg-core" containerID="cri-o://dbe02e600b5c4897f2ed2e5e1dcbfee99a17412f32d469611168cef5dc81a29a" gracePeriod=30 Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.548280 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:55:26 crc kubenswrapper[4913]: I1001 12:55:26.572210 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.933975887 podStartE2EDuration="8.572192935s" podCreationTimestamp="2025-10-01 12:55:18 +0000 UTC" firstStartedPulling="2025-10-01 12:55:20.024001131 +0000 UTC m=+1051.927476709" lastFinishedPulling="2025-10-01 12:55:25.662218179 +0000 UTC m=+1057.565693757" observedRunningTime="2025-10-01 12:55:26.569237594 +0000 UTC m=+1058.472713222" watchObservedRunningTime="2025-10-01 12:55:26.572192935 +0000 UTC m=+1058.475668513" Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557627 4913 generic.go:334] "Generic (PLEG): container finished" podID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerID="c2e25c4896c7a15e6821d2984433223a86e2bc22f14c1c5e685387ba332e0c52" exitCode=0 Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557918 4913 generic.go:334] "Generic (PLEG): container finished" podID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerID="dbe02e600b5c4897f2ed2e5e1dcbfee99a17412f32d469611168cef5dc81a29a" exitCode=2 Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557926 4913 generic.go:334] "Generic (PLEG): container finished" podID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerID="5869f8ccac15db46a659358570e01bbab7e658e85ad6ede1ac0980d5d1618fc0" exitCode=0 Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557933 4913 generic.go:334] "Generic (PLEG): container finished" podID="72edfef8-0b60-42d5-a649-398d42c9daa7" 
containerID="c1b5178ed2df16c415bf4b28626249152a41108b30a58ea928d1c662fff05ec8" exitCode=0 Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557697 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerDied","Data":"c2e25c4896c7a15e6821d2984433223a86e2bc22f14c1c5e685387ba332e0c52"} Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557962 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerDied","Data":"dbe02e600b5c4897f2ed2e5e1dcbfee99a17412f32d469611168cef5dc81a29a"} Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557973 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerDied","Data":"5869f8ccac15db46a659358570e01bbab7e658e85ad6ede1ac0980d5d1618fc0"} Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.557982 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerDied","Data":"c1b5178ed2df16c415bf4b28626249152a41108b30a58ea928d1c662fff05ec8"} Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.744420 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.812603 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77bdb78dfc-7qtdk"] Oct 01 12:55:27 crc kubenswrapper[4913]: I1001 12:55:27.812828 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="dnsmasq-dns" containerID="cri-o://ba348fb80160b49f753b98eea15c0b9164598e4a361122db72555e1cf88b64d5" gracePeriod=10 Oct 01 12:55:28 crc kubenswrapper[4913]: I1001 12:55:28.572388 4913 generic.go:334] "Generic (PLEG): container finished" podID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerID="ba348fb80160b49f753b98eea15c0b9164598e4a361122db72555e1cf88b64d5" exitCode=0 Oct 01 12:55:28 crc kubenswrapper[4913]: I1001 12:55:28.572785 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" event={"ID":"60087e2e-0b0a-4a75-97be-912e06c0b17a","Type":"ContainerDied","Data":"ba348fb80160b49f753b98eea15c0b9164598e4a361122db72555e1cf88b64d5"} Oct 01 12:55:28 crc kubenswrapper[4913]: I1001 12:55:28.576950 4913 generic.go:334] "Generic (PLEG): container finished" podID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerID="dad074d9b09f0dcf53d2ce6062cc6b470772e246bdb6bf5e4e2e77f88c5e4ff6" exitCode=0 Oct 01 12:55:28 crc kubenswrapper[4913]: I1001 12:55:28.576981 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-778944cb96-tm27q" event={"ID":"fe92d074-cafa-4a38-a8a3-f49068a3366d","Type":"ContainerDied","Data":"dad074d9b09f0dcf53d2ce6062cc6b470772e246bdb6bf5e4e2e77f88c5e4ff6"} Oct 01 12:55:29 crc kubenswrapper[4913]: I1001 12:55:29.563763 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-778944cb96-tm27q" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": dial tcp 10.217.0.155:9311: connect: connection refused" Oct 01 12:55:29 crc kubenswrapper[4913]: I1001 
12:55:29.564019 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-778944cb96-tm27q" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": dial tcp 10.217.0.155:9311: connect: connection refused" Oct 01 12:55:30 crc kubenswrapper[4913]: I1001 12:55:30.493351 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 12:55:30 crc kubenswrapper[4913]: I1001 12:55:30.619976 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.436650 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.533860 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-run-httpd\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.533921 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-combined-ca-bundle\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.533984 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcwst\" (UniqueName: \"kubernetes.io/projected/72edfef8-0b60-42d5-a649-398d42c9daa7-kube-api-access-hcwst\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.534080 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-sg-core-conf-yaml\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.534130 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-config-data\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.534213 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-log-httpd\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.534363 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-scripts\") pod \"72edfef8-0b60-42d5-a649-398d42c9daa7\" (UID: \"72edfef8-0b60-42d5-a649-398d42c9daa7\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.534742 4913 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.534854 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.535110 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.535131 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72edfef8-0b60-42d5-a649-398d42c9daa7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.539229 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-scripts" (OuterVolumeSpecName: "scripts") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.551353 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72edfef8-0b60-42d5-a649-398d42c9daa7-kube-api-access-hcwst" (OuterVolumeSpecName: "kube-api-access-hcwst") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "kube-api-access-hcwst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.565820 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.566108 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.589017 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636516 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-combined-ca-bundle\") pod \"fe92d074-cafa-4a38-a8a3-f49068a3366d\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636605 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-config\") pod \"60087e2e-0b0a-4a75-97be-912e06c0b17a\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636702 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data\") pod \"fe92d074-cafa-4a38-a8a3-f49068a3366d\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636781 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-nb\") pod \"60087e2e-0b0a-4a75-97be-912e06c0b17a\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636806 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qdbw\" (UniqueName: \"kubernetes.io/projected/fe92d074-cafa-4a38-a8a3-f49068a3366d-kube-api-access-6qdbw\") pod \"fe92d074-cafa-4a38-a8a3-f49068a3366d\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636844 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-sb\") pod \"60087e2e-0b0a-4a75-97be-912e06c0b17a\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636874 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data-custom\") pod \"fe92d074-cafa-4a38-a8a3-f49068a3366d\" (UID: \"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636904 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-dns-svc\") pod \"60087e2e-0b0a-4a75-97be-912e06c0b17a\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.636977 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrvzh\" (UniqueName: \"kubernetes.io/projected/60087e2e-0b0a-4a75-97be-912e06c0b17a-kube-api-access-mrvzh\") pod \"60087e2e-0b0a-4a75-97be-912e06c0b17a\" (UID: \"60087e2e-0b0a-4a75-97be-912e06c0b17a\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.637069 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe92d074-cafa-4a38-a8a3-f49068a3366d-logs\") pod \"fe92d074-cafa-4a38-a8a3-f49068a3366d\" (UID: 
\"fe92d074-cafa-4a38-a8a3-f49068a3366d\") " Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.637577 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcwst\" (UniqueName: \"kubernetes.io/projected/72edfef8-0b60-42d5-a649-398d42c9daa7-kube-api-access-hcwst\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.637602 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.637614 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.637958 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe92d074-cafa-4a38-a8a3-f49068a3366d-logs" (OuterVolumeSpecName: "logs") pod "fe92d074-cafa-4a38-a8a3-f49068a3366d" (UID: "fe92d074-cafa-4a38-a8a3-f49068a3366d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.639321 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72edfef8-0b60-42d5-a649-398d42c9daa7","Type":"ContainerDied","Data":"6c3bd8c3cafefe468b49a78a9e665af2169727471094c793563d741459baea0c"} Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.639377 4913 scope.go:117] "RemoveContainer" containerID="c2e25c4896c7a15e6821d2984433223a86e2bc22f14c1c5e685387ba332e0c52" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.639521 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.645435 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.646599 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60087e2e-0b0a-4a75-97be-912e06c0b17a-kube-api-access-mrvzh" (OuterVolumeSpecName: "kube-api-access-mrvzh") pod "60087e2e-0b0a-4a75-97be-912e06c0b17a" (UID: "60087e2e-0b0a-4a75-97be-912e06c0b17a"). InnerVolumeSpecName "kube-api-access-mrvzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.652083 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t5ghk" event={"ID":"ba0e4f52-1a20-4443-98c4-03620eec847f","Type":"ContainerStarted","Data":"527e1694afd53727f343962c8ca2f8b91a5e17ef446bce052be1a32fa1bc7524"} Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.655753 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fe92d074-cafa-4a38-a8a3-f49068a3366d" (UID: "fe92d074-cafa-4a38-a8a3-f49068a3366d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.657097 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe92d074-cafa-4a38-a8a3-f49068a3366d-kube-api-access-6qdbw" (OuterVolumeSpecName: "kube-api-access-6qdbw") pod "fe92d074-cafa-4a38-a8a3-f49068a3366d" (UID: "fe92d074-cafa-4a38-a8a3-f49068a3366d"). InnerVolumeSpecName "kube-api-access-6qdbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.658036 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-778944cb96-tm27q" event={"ID":"fe92d074-cafa-4a38-a8a3-f49068a3366d","Type":"ContainerDied","Data":"a17f21edf05c5dde966137a9382c34d6ae3901237c4fca42bb0308ea2fddc496"} Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.658115 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-778944cb96-tm27q" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.660936 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" event={"ID":"60087e2e-0b0a-4a75-97be-912e06c0b17a","Type":"ContainerDied","Data":"e1a8d0833f2f0740453843a3319514f7ad8a00bec93652aaca490aec89ffe5d9"} Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.661021 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77bdb78dfc-7qtdk" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.682179 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-config-data" (OuterVolumeSpecName: "config-data") pod "72edfef8-0b60-42d5-a649-398d42c9daa7" (UID: "72edfef8-0b60-42d5-a649-398d42c9daa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.683247 4913 scope.go:117] "RemoveContainer" containerID="dbe02e600b5c4897f2ed2e5e1dcbfee99a17412f32d469611168cef5dc81a29a" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.689060 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe92d074-cafa-4a38-a8a3-f49068a3366d" (UID: "fe92d074-cafa-4a38-a8a3-f49068a3366d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.691191 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-t5ghk" podStartSLOduration=2.1593252 podStartE2EDuration="15.691112781s" podCreationTimestamp="2025-10-01 12:55:16 +0000 UTC" firstStartedPulling="2025-10-01 12:55:17.561381819 +0000 UTC m=+1049.464857387" lastFinishedPulling="2025-10-01 12:55:31.09316939 +0000 UTC m=+1062.996644968" observedRunningTime="2025-10-01 12:55:31.674131265 +0000 UTC m=+1063.577606863" watchObservedRunningTime="2025-10-01 12:55:31.691112781 +0000 UTC m=+1063.594588359" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.716219 4913 scope.go:117] "RemoveContainer" containerID="5869f8ccac15db46a659358570e01bbab7e658e85ad6ede1ac0980d5d1618fc0" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.718166 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60087e2e-0b0a-4a75-97be-912e06c0b17a" (UID: "60087e2e-0b0a-4a75-97be-912e06c0b17a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.718178 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60087e2e-0b0a-4a75-97be-912e06c0b17a" (UID: "60087e2e-0b0a-4a75-97be-912e06c0b17a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.721040 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60087e2e-0b0a-4a75-97be-912e06c0b17a" (UID: "60087e2e-0b0a-4a75-97be-912e06c0b17a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.722442 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-config" (OuterVolumeSpecName: "config") pod "60087e2e-0b0a-4a75-97be-912e06c0b17a" (UID: "60087e2e-0b0a-4a75-97be-912e06c0b17a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740396 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740461 4913 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740480 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740509 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrvzh\" (UniqueName: \"kubernetes.io/projected/60087e2e-0b0a-4a75-97be-912e06c0b17a-kube-api-access-mrvzh\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740526 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740538 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe92d074-cafa-4a38-a8a3-f49068a3366d-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740551 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740565 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740579 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72edfef8-0b60-42d5-a649-398d42c9daa7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740590 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60087e2e-0b0a-4a75-97be-912e06c0b17a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.740602 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qdbw\" (UniqueName: \"kubernetes.io/projected/fe92d074-cafa-4a38-a8a3-f49068a3366d-kube-api-access-6qdbw\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.743237 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data" (OuterVolumeSpecName: "config-data") pod "fe92d074-cafa-4a38-a8a3-f49068a3366d" (UID: "fe92d074-cafa-4a38-a8a3-f49068a3366d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.750726 4913 scope.go:117] "RemoveContainer" containerID="c1b5178ed2df16c415bf4b28626249152a41108b30a58ea928d1c662fff05ec8" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.781190 4913 scope.go:117] "RemoveContainer" containerID="dad074d9b09f0dcf53d2ce6062cc6b470772e246bdb6bf5e4e2e77f88c5e4ff6" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.801379 4913 scope.go:117] "RemoveContainer" containerID="9c0e67c0b49a361639069473d3506f3057f6038adb8989064f9c6907092d999d" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.826962 4913 scope.go:117] "RemoveContainer" containerID="ba348fb80160b49f753b98eea15c0b9164598e4a361122db72555e1cf88b64d5" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.842418 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe92d074-cafa-4a38-a8a3-f49068a3366d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.844781 4913 scope.go:117] "RemoveContainer" containerID="6173fd595b3f63b80b92d1efe7011d693c1aafa543de6b8df47e1603e6d83636" Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.969948 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:31 crc kubenswrapper[4913]: I1001 12:55:31.982986 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.005060 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-778944cb96-tm27q"] Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.014143 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-778944cb96-tm27q"] Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028135 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028465 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-notification-agent" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028477 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-notification-agent" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028487 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="dnsmasq-dns" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028492 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="dnsmasq-dns" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028512 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-central-agent" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028519 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-central-agent" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028527 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api-log" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028533 4913 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api-log" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028542 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="sg-core" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028548 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="sg-core" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028559 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="proxy-httpd" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028564 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="proxy-httpd" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028574 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="init" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028579 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="init" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028590 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028595 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api" Oct 01 12:55:32 crc kubenswrapper[4913]: E1001 12:55:32.028614 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639c9d26-ffcf-4cd4-989c-de3f777ec5ea" containerName="mariadb-account-create" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028620 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="639c9d26-ffcf-4cd4-989c-de3f777ec5ea" containerName="mariadb-account-create" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028768 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="proxy-httpd" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028778 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="639c9d26-ffcf-4cd4-989c-de3f777ec5ea" containerName="mariadb-account-create" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028786 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api-log" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028793 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" containerName="dnsmasq-dns" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028804 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="sg-core" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028817 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" containerName="barbican-api" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028830 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-central-agent" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.028837 4913 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" containerName="ceilometer-notification-agent" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.030215 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.038252 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77bdb78dfc-7qtdk"] Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.038916 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.039100 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045328 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045445 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-run-httpd\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045485 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045500 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-config-data\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045576 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n85n\" (UniqueName: \"kubernetes.io/projected/4af94082-eaf8-4d0f-ac71-d7a3055539a0-kube-api-access-7n85n\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045610 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-log-httpd\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.045689 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-scripts\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.051397 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77bdb78dfc-7qtdk"] Oct 01 
12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.054851 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147230 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147302 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-config-data\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147327 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n85n\" (UniqueName: \"kubernetes.io/projected/4af94082-eaf8-4d0f-ac71-d7a3055539a0-kube-api-access-7n85n\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147346 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-log-httpd\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147396 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-scripts\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147453 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.147560 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-run-httpd\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.148158 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-log-httpd\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.148209 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-run-httpd\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.151132 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.151176 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-scripts\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.153866 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.155222 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-config-data\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.166585 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n85n\" (UniqueName: \"kubernetes.io/projected/4af94082-eaf8-4d0f-ac71-d7a3055539a0-kube-api-access-7n85n\") pod \"ceilometer-0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") " pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.394483 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.508396 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.823484 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60087e2e-0b0a-4a75-97be-912e06c0b17a" path="/var/lib/kubelet/pods/60087e2e-0b0a-4a75-97be-912e06c0b17a/volumes" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.825141 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72edfef8-0b60-42d5-a649-398d42c9daa7" path="/var/lib/kubelet/pods/72edfef8-0b60-42d5-a649-398d42c9daa7/volumes" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.826150 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe92d074-cafa-4a38-a8a3-f49068a3366d" path="/var/lib/kubelet/pods/fe92d074-cafa-4a38-a8a3-f49068a3366d/volumes" Oct 01 12:55:32 crc kubenswrapper[4913]: I1001 12:55:32.873925 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:33 crc kubenswrapper[4913]: I1001 12:55:33.689727 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerStarted","Data":"897227f6bb01635960ef96780c51c66f66615973ed3ef9573ec85705546430ae"} Oct 01 12:55:33 crc kubenswrapper[4913]: I1001 12:55:33.690012 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerStarted","Data":"ffe691b5610b12eb6857b3092d3e95c899761fe3643d56745f50d42d6a6b153c"} Oct 01 12:55:34 crc kubenswrapper[4913]: I1001 12:55:34.701519 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerStarted","Data":"c61c773cf73275c4fed09041634a871e976a4c3f872a7e9e6662ab51e65aa447"} Oct 01 12:55:35 crc kubenswrapper[4913]: I1001 12:55:35.710915 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerStarted","Data":"08900c922ee4b2ea76d2a7099749b1b91d98e31bc1950527e186364190f5fa5a"} Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.728124 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerStarted","Data":"72761cafd7c7bb26f5394525c63ae1b72c0ceb118243c5315864e7509da12644"} Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.728390 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-central-agent" containerID="cri-o://897227f6bb01635960ef96780c51c66f66615973ed3ef9573ec85705546430ae" gracePeriod=30 Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.728462 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="proxy-httpd" containerID="cri-o://72761cafd7c7bb26f5394525c63ae1b72c0ceb118243c5315864e7509da12644" gracePeriod=30 Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.728508 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-notification-agent" containerID="cri-o://c61c773cf73275c4fed09041634a871e976a4c3f872a7e9e6662ab51e65aa447" gracePeriod=30 Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.728462 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="sg-core" containerID="cri-o://08900c922ee4b2ea76d2a7099749b1b91d98e31bc1950527e186364190f5fa5a" gracePeriod=30 Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.728637 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:55:37 crc kubenswrapper[4913]: I1001 12:55:37.760134 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.141823375 podStartE2EDuration="6.760114504s" podCreationTimestamp="2025-10-01 12:55:31 +0000 UTC" firstStartedPulling="2025-10-01 12:55:32.871376035 +0000 UTC m=+1064.774851613" lastFinishedPulling="2025-10-01 12:55:36.489667164 +0000 UTC m=+1068.393142742" observedRunningTime="2025-10-01 12:55:37.752485094 +0000 UTC m=+1069.655960692" watchObservedRunningTime="2025-10-01 12:55:37.760114504 +0000 UTC m=+1069.663590082" Oct 01 12:55:38 crc kubenswrapper[4913]: I1001 12:55:38.743797 4913 generic.go:334] "Generic (PLEG): container finished" podID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerID="72761cafd7c7bb26f5394525c63ae1b72c0ceb118243c5315864e7509da12644" exitCode=0 Oct 01 12:55:38 crc kubenswrapper[4913]: I1001 12:55:38.744150 4913 generic.go:334] "Generic (PLEG): container finished" podID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerID="08900c922ee4b2ea76d2a7099749b1b91d98e31bc1950527e186364190f5fa5a" exitCode=2 Oct 01 12:55:38 crc kubenswrapper[4913]: I1001 12:55:38.744169 4913 generic.go:334] "Generic (PLEG): container finished" 
podID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerID="c61c773cf73275c4fed09041634a871e976a4c3f872a7e9e6662ab51e65aa447" exitCode=0 Oct 01 12:55:38 crc kubenswrapper[4913]: I1001 12:55:38.743842 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerDied","Data":"72761cafd7c7bb26f5394525c63ae1b72c0ceb118243c5315864e7509da12644"} Oct 01 12:55:38 crc kubenswrapper[4913]: I1001 12:55:38.744220 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerDied","Data":"08900c922ee4b2ea76d2a7099749b1b91d98e31bc1950527e186364190f5fa5a"} Oct 01 12:55:38 crc kubenswrapper[4913]: I1001 12:55:38.744242 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerDied","Data":"c61c773cf73275c4fed09041634a871e976a4c3f872a7e9e6662ab51e65aa447"} Oct 01 12:55:39 crc kubenswrapper[4913]: I1001 12:55:39.761188 4913 generic.go:334] "Generic (PLEG): container finished" podID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerID="897227f6bb01635960ef96780c51c66f66615973ed3ef9573ec85705546430ae" exitCode=0 Oct 01 12:55:39 crc kubenswrapper[4913]: I1001 12:55:39.761245 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerDied","Data":"897227f6bb01635960ef96780c51c66f66615973ed3ef9573ec85705546430ae"} Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.083885 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.083949 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.102818 4913 util.go:48] "No ready sandbox for pod can be found. 
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.294880 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-log-httpd\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295034 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-sg-core-conf-yaml\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295099 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-scripts\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295162 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-config-data\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295314 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-combined-ca-bundle\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295423 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295504 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-run-httpd\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.295774 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n85n\" (UniqueName: \"kubernetes.io/projected/4af94082-eaf8-4d0f-ac71-d7a3055539a0-kube-api-access-7n85n\") pod \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\" (UID: \"4af94082-eaf8-4d0f-ac71-d7a3055539a0\") "
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.296125 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.299165 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.299428 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af94082-eaf8-4d0f-ac71-d7a3055539a0-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.302610 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af94082-eaf8-4d0f-ac71-d7a3055539a0-kube-api-access-7n85n" (OuterVolumeSpecName: "kube-api-access-7n85n") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "kube-api-access-7n85n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.303105 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-scripts" (OuterVolumeSpecName: "scripts") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.323442 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.359536 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.389451 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-config-data" (OuterVolumeSpecName: "config-data") pod "4af94082-eaf8-4d0f-ac71-d7a3055539a0" (UID: "4af94082-eaf8-4d0f-ac71-d7a3055539a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.400776 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.400847 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.401036 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.401045 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af94082-eaf8-4d0f-ac71-d7a3055539a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.401055 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n85n\" (UniqueName: \"kubernetes.io/projected/4af94082-eaf8-4d0f-ac71-d7a3055539a0-kube-api-access-7n85n\") on node \"crc\" DevicePath \"\""
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.772645 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af94082-eaf8-4d0f-ac71-d7a3055539a0","Type":"ContainerDied","Data":"ffe691b5610b12eb6857b3092d3e95c899761fe3643d56745f50d42d6a6b153c"}
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.772707 4913 scope.go:117] "RemoveContainer" containerID="72761cafd7c7bb26f5394525c63ae1b72c0ceb118243c5315864e7509da12644"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.772869 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.793411 4913 scope.go:117] "RemoveContainer" containerID="08900c922ee4b2ea76d2a7099749b1b91d98e31bc1950527e186364190f5fa5a"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.820478 4913 scope.go:117] "RemoveContainer" containerID="c61c773cf73275c4fed09041634a871e976a4c3f872a7e9e6662ab51e65aa447"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.838547 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.856678 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.871056 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:55:40 crc kubenswrapper[4913]: E1001 12:55:40.871670 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-central-agent"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.871694 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-central-agent"
Oct 01 12:55:40 crc kubenswrapper[4913]: E1001 12:55:40.871710 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="proxy-httpd"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.871720 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="proxy-httpd"
Oct 01 12:55:40 crc kubenswrapper[4913]: E1001 12:55:40.871904 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="sg-core"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.871915 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="sg-core"
Oct 01 12:55:40 crc kubenswrapper[4913]: E1001 12:55:40.871950 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-notification-agent"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.871959 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-notification-agent"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.872179 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="proxy-httpd"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.872204 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="sg-core"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.872225 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-notification-agent"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.872246 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" containerName="ceilometer-central-agent"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.885554 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.889171 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.889440 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.889913 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.897254 4913 scope.go:117] "RemoveContainer" containerID="897227f6bb01635960ef96780c51c66f66615973ed3ef9573ec85705546430ae"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913270 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-config-data\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913340 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs9tl\" (UniqueName: \"kubernetes.io/projected/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-kube-api-access-gs9tl\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913393 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913426 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-log-httpd\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913614 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913631 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-run-httpd\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:40 crc kubenswrapper[4913]: I1001 12:55:40.913684 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-scripts\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015091 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015128 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-run-httpd\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015152 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-scripts\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015645 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-run-httpd\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015698 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-config-data\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015761 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9tl\" (UniqueName: \"kubernetes.io/projected/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-kube-api-access-gs9tl\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.015794 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.016436 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-log-httpd\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.016890 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-log-httpd\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.020845 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.021081 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-scripts\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.021310 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-config-data\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.022916 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.033217 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs9tl\" (UniqueName: \"kubernetes.io/projected/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-kube-api-access-gs9tl\") pod \"ceilometer-0\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.215675 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.690302 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:55:41 crc kubenswrapper[4913]: I1001 12:55:41.782438 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerStarted","Data":"46f1f5be0f1b7a073aa6853acaa70f0ab7c3a0137cad980b6a5582b4cdf123ef"}
Oct 01 12:55:42 crc kubenswrapper[4913]: I1001 12:55:42.792390 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerStarted","Data":"21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6"}
Oct 01 12:55:42 crc kubenswrapper[4913]: I1001 12:55:42.795011 4913 generic.go:334] "Generic (PLEG): container finished" podID="ba0e4f52-1a20-4443-98c4-03620eec847f" containerID="527e1694afd53727f343962c8ca2f8b91a5e17ef446bce052be1a32fa1bc7524" exitCode=0
Oct 01 12:55:42 crc kubenswrapper[4913]: I1001 12:55:42.795050 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t5ghk" event={"ID":"ba0e4f52-1a20-4443-98c4-03620eec847f","Type":"ContainerDied","Data":"527e1694afd53727f343962c8ca2f8b91a5e17ef446bce052be1a32fa1bc7524"}
Oct 01 12:55:42 crc kubenswrapper[4913]: I1001 12:55:42.817873 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af94082-eaf8-4d0f-ac71-d7a3055539a0" path="/var/lib/kubelet/pods/4af94082-eaf8-4d0f-ac71-d7a3055539a0/volumes"
Oct 01 12:55:43 crc kubenswrapper[4913]: I1001 12:55:43.811420 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerStarted","Data":"1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6"}
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.173131 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t5ghk"
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.277117 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-config-data\") pod \"ba0e4f52-1a20-4443-98c4-03620eec847f\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") "
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.277363 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-scripts\") pod \"ba0e4f52-1a20-4443-98c4-03620eec847f\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") "
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.277450 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gqv6\" (UniqueName: \"kubernetes.io/projected/ba0e4f52-1a20-4443-98c4-03620eec847f-kube-api-access-2gqv6\") pod \"ba0e4f52-1a20-4443-98c4-03620eec847f\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") "
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.277486 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-combined-ca-bundle\") pod \"ba0e4f52-1a20-4443-98c4-03620eec847f\" (UID: \"ba0e4f52-1a20-4443-98c4-03620eec847f\") "
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.296583 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-scripts" (OuterVolumeSpecName: "scripts") pod "ba0e4f52-1a20-4443-98c4-03620eec847f" (UID: "ba0e4f52-1a20-4443-98c4-03620eec847f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.296611 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0e4f52-1a20-4443-98c4-03620eec847f-kube-api-access-2gqv6" (OuterVolumeSpecName: "kube-api-access-2gqv6") pod "ba0e4f52-1a20-4443-98c4-03620eec847f" (UID: "ba0e4f52-1a20-4443-98c4-03620eec847f"). InnerVolumeSpecName "kube-api-access-2gqv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.302824 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-config-data" (OuterVolumeSpecName: "config-data") pod "ba0e4f52-1a20-4443-98c4-03620eec847f" (UID: "ba0e4f52-1a20-4443-98c4-03620eec847f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.327386 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba0e4f52-1a20-4443-98c4-03620eec847f" (UID: "ba0e4f52-1a20-4443-98c4-03620eec847f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.379933 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gqv6\" (UniqueName: \"kubernetes.io/projected/ba0e4f52-1a20-4443-98c4-03620eec847f-kube-api-access-2gqv6\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.379980 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.379994 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.380006 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba0e4f52-1a20-4443-98c4-03620eec847f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.853038 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t5ghk" event={"ID":"ba0e4f52-1a20-4443-98c4-03620eec847f","Type":"ContainerDied","Data":"2ea498f1b799cf45b13eab183b39e90f74a8fe8940a52972ea96a5af52a42788"} Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.853081 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea498f1b799cf45b13eab183b39e90f74a8fe8940a52972ea96a5af52a42788" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.853055 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t5ghk" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.858367 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerStarted","Data":"9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356"} Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.911204 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 12:55:44 crc kubenswrapper[4913]: E1001 12:55:44.911640 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0e4f52-1a20-4443-98c4-03620eec847f" containerName="nova-cell0-conductor-db-sync" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.911666 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0e4f52-1a20-4443-98c4-03620eec847f" containerName="nova-cell0-conductor-db-sync" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.911903 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0e4f52-1a20-4443-98c4-03620eec847f" containerName="nova-cell0-conductor-db-sync" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.912571 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.915661 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.917500 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jz8n5" Oct 01 12:55:44 crc kubenswrapper[4913]: I1001 12:55:44.949258 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.093730 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s96v\" (UniqueName: \"kubernetes.io/projected/ca9902fd-1820-476e-997b-78f8f80c9d10-kube-api-access-2s96v\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.093812 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9902fd-1820-476e-997b-78f8f80c9d10-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.093966 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9902fd-1820-476e-997b-78f8f80c9d10-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.195228 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9902fd-1820-476e-997b-78f8f80c9d10-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.195559 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s96v\" (UniqueName: \"kubernetes.io/projected/ca9902fd-1820-476e-997b-78f8f80c9d10-kube-api-access-2s96v\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.195700 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9902fd-1820-476e-997b-78f8f80c9d10-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.200152 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca9902fd-1820-476e-997b-78f8f80c9d10-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.208583 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9902fd-1820-476e-997b-78f8f80c9d10-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.214891 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s96v\" (UniqueName: \"kubernetes.io/projected/ca9902fd-1820-476e-997b-78f8f80c9d10-kube-api-access-2s96v\") pod \"nova-cell0-conductor-0\" (UID: \"ca9902fd-1820-476e-997b-78f8f80c9d10\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.237202 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.688444 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 12:55:45 crc kubenswrapper[4913]: W1001 12:55:45.691443 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca9902fd_1820_476e_997b_78f8f80c9d10.slice/crio-bb48261434a0e8e0eddee7e95d3ed3d5f5d7bc9eadf3df2914199c31fa985eec WatchSource:0}: Error finding container bb48261434a0e8e0eddee7e95d3ed3d5f5d7bc9eadf3df2914199c31fa985eec: Status 404 returned error can't find the container with id bb48261434a0e8e0eddee7e95d3ed3d5f5d7bc9eadf3df2914199c31fa985eec Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.870508 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerStarted","Data":"79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41"} Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.871498 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:55:45 crc kubenswrapper[4913]: I1001 12:55:45.872390 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ca9902fd-1820-476e-997b-78f8f80c9d10","Type":"ContainerStarted","Data":"bb48261434a0e8e0eddee7e95d3ed3d5f5d7bc9eadf3df2914199c31fa985eec"} Oct 01 12:55:46 crc kubenswrapper[4913]: I1001 12:55:46.890522 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ca9902fd-1820-476e-997b-78f8f80c9d10","Type":"ContainerStarted","Data":"edadff2455c25b77bba3042ad5d9e1058d605876b8f3132992d7c354882d10cc"} Oct 01 12:55:46 crc kubenswrapper[4913]: I1001 12:55:46.910784 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.910763346 podStartE2EDuration="2.910763346s" podCreationTimestamp="2025-10-01 12:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:46.906445978 +0000 UTC m=+1078.809921566" watchObservedRunningTime="2025-10-01 12:55:46.910763346 +0000 UTC m=+1078.814238924" Oct 01 12:55:46 crc kubenswrapper[4913]: I1001 12:55:46.916378 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.4024198549999998 podStartE2EDuration="6.91635892s" podCreationTimestamp="2025-10-01 12:55:40 +0000 UTC" firstStartedPulling="2025-10-01 12:55:41.701682816 +0000 UTC m=+1073.605158394" lastFinishedPulling="2025-10-01 12:55:45.215621881 +0000 UTC m=+1077.119097459" observedRunningTime="2025-10-01 12:55:45.899805599 +0000 UTC m=+1077.803281197" 
watchObservedRunningTime="2025-10-01 12:55:46.91635892 +0000 UTC m=+1078.819834498" Oct 01 12:55:47 crc kubenswrapper[4913]: I1001 12:55:47.815009 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5759df9d4d-h2pz9" Oct 01 12:55:47 crc kubenswrapper[4913]: I1001 12:55:47.898810 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:48 crc kubenswrapper[4913]: I1001 12:55:48.006329 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="51ed98c9-9585-44d2-a913-ebdcfa04ac53" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.146:3000/\": dial tcp 10.217.0.146:3000: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.273904 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.748825 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8gshc"] Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.750039 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.753283 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.757040 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.762169 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8gshc"] Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.788253 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcg72\" (UniqueName: \"kubernetes.io/projected/36c41052-7191-462a-9627-9a2fbe9206b3-kube-api-access-rcg72\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.788310 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-config-data\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.788350 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.788369 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-scripts\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.903063 4913 
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.903122 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-config-data\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.903224 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.903249 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-scripts\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.914774 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.916445 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-config-data\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.940843 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-scripts\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.944886 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcg72\" (UniqueName: \"kubernetes.io/projected/36c41052-7191-462a-9627-9a2fbe9206b3-kube-api-access-rcg72\") pod \"nova-cell0-cell-mapping-8gshc\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:50 crc kubenswrapper[4913]: I1001 12:55:50.998596 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.005894 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.009679 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.060394 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.062005 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.070794 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8gshc"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.092341 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.096790 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.114461 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-config-data\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.114734 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggj2\" (UniqueName: \"kubernetes.io/projected/67969aa2-bca2-4e09-b0c7-cc8949849046-kube-api-access-kggj2\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.114834 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195c6907-2401-4fd4-8d60-8c4c7d5a6266-logs\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.114921 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.115001 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-config-data\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.115090 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447pr\" (UniqueName: \"kubernetes.io/projected/195c6907-2401-4fd4-8d60-8c4c7d5a6266-kube-api-access-447pr\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.115197 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.129344 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.149754 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6854dd75d7-6cgpn"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.195556 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.197073 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.201374 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b454497-drt5d"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.209516 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b454497-drt5d"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.212398 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.213589 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.215671 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.215816 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217578 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzf2\" (UniqueName: \"kubernetes.io/projected/122bc201-edae-47f2-a752-818ba02b0dea-kube-api-access-6nzf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217622 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195c6907-2401-4fd4-8d60-8c4c7d5a6266-logs\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217653 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217668 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-config-data\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217683 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-447pr\" (UniqueName: \"kubernetes.io/projected/195c6907-2401-4fd4-8d60-8c4c7d5a6266-kube-api-access-447pr\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217716 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfm2l\" (UniqueName: \"kubernetes.io/projected/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-kube-api-access-gfm2l\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217736 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-dns-svc\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217754 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217790 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217822 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0dce13-24d2-41a8-8464-ea574800e22c-logs\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217850 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-config-data\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217886 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4fpp\" (UniqueName: \"kubernetes.io/projected/8d0dce13-24d2-41a8-8464-ea574800e22c-kube-api-access-v4fpp\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217936 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-nb\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d"
Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.217991 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-config\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d"
pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.218033 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.218063 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-config-data\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.218082 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-sb\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.218104 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.218130 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kggj2\" (UniqueName: \"kubernetes.io/projected/67969aa2-bca2-4e09-b0c7-cc8949849046-kube-api-access-kggj2\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.218298 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195c6907-2401-4fd4-8d60-8c4c7d5a6266-logs\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.239818 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.242230 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.244386 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-config-data\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.244701 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-config-data\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.251920 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-447pr\" (UniqueName: \"kubernetes.io/projected/195c6907-2401-4fd4-8d60-8c4c7d5a6266-kube-api-access-447pr\") pod \"nova-metadata-0\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.257877 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kggj2\" (UniqueName: \"kubernetes.io/projected/67969aa2-bca2-4e09-b0c7-cc8949849046-kube-api-access-kggj2\") pod \"nova-scheduler-0\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.291260 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.314073 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320751 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfm2l\" (UniqueName: \"kubernetes.io/projected/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-kube-api-access-gfm2l\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320782 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-dns-svc\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320803 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320842 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0dce13-24d2-41a8-8464-ea574800e22c-logs\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320865 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-config-data\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320889 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4fpp\" (UniqueName: \"kubernetes.io/projected/8d0dce13-24d2-41a8-8464-ea574800e22c-kube-api-access-v4fpp\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 
12:55:51.320907 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-nb\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320937 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-config\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320969 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.320992 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-sb\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.321011 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.321036 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzf2\" (UniqueName: \"kubernetes.io/projected/122bc201-edae-47f2-a752-818ba02b0dea-kube-api-access-6nzf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.322594 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-dns-svc\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.324924 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-nb\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.325204 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0dce13-24d2-41a8-8464-ea574800e22c-logs\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.328950 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.329072 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-config\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.331227 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-config-data\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.332320 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b454497-drt5d"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.332567 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-sb\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.333057 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.348805 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5759df9d4d-h2pz9"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.349057 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5759df9d4d-h2pz9" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-api" containerID="cri-o://dad297f85cc097635e915f216c866d1131fbd782e96b3725514e98b52ba268af" gracePeriod=30 Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.349436 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5759df9d4d-h2pz9" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-httpd" containerID="cri-o://74797fe20c5f882f1deddc601251f5765b0a34705cd4616ab5cb24147e03b90a" gracePeriod=30 Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.350495 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.353045 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.356351 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfm2l\" (UniqueName: \"kubernetes.io/projected/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-kube-api-access-gfm2l\") pod \"dnsmasq-dns-745b454497-drt5d\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.360620 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzf2\" (UniqueName: \"kubernetes.io/projected/122bc201-edae-47f2-a752-818ba02b0dea-kube-api-access-6nzf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.363019 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4fpp\" (UniqueName: \"kubernetes.io/projected/8d0dce13-24d2-41a8-8464-ea574800e22c-kube-api-access-v4fpp\") pod \"nova-api-0\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.496570 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.603519 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.619970 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.629672 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.670705 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8gshc"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.823471 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qz2dw"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.825801 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.830041 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.830293 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.844255 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.864090 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qz2dw"] Oct 01 12:55:51 crc kubenswrapper[4913]: W1001 12:55:51.875792 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod195c6907_2401_4fd4_8d60_8c4c7d5a6266.slice/crio-8b1f9f9c482532cb32a9e5f81980a87dd74ec705936109acafa67aacabaa33e5 WatchSource:0}: Error finding container 8b1f9f9c482532cb32a9e5f81980a87dd74ec705936109acafa67aacabaa33e5: Status 404 returned error can't find the container with id 8b1f9f9c482532cb32a9e5f81980a87dd74ec705936109acafa67aacabaa33e5 Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.934338 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w6gs\" (UniqueName: \"kubernetes.io/projected/d06000dd-9a73-4695-a477-0f361c61cf57-kube-api-access-7w6gs\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.934395 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-scripts\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.934443 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-config-data\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.934716 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.975448 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195c6907-2401-4fd4-8d60-8c4c7d5a6266","Type":"ContainerStarted","Data":"8b1f9f9c482532cb32a9e5f81980a87dd74ec705936109acafa67aacabaa33e5"} Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.976899 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8gshc" 
event={"ID":"36c41052-7191-462a-9627-9a2fbe9206b3","Type":"ContainerStarted","Data":"feed6750a08eca3f6a749e9f118ffd5c65abeea2b023fca0a5b65ceb9a9f2cd4"} Oct 01 12:55:51 crc kubenswrapper[4913]: I1001 12:55:51.981741 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:51 crc kubenswrapper[4913]: W1001 12:55:51.991771 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67969aa2_bca2_4e09_b0c7_cc8949849046.slice/crio-9b4a2d892a66304159fab5b24ba08917a1407f9e197844f628c7621e3db5c53f WatchSource:0}: Error finding container 9b4a2d892a66304159fab5b24ba08917a1407f9e197844f628c7621e3db5c53f: Status 404 returned error can't find the container with id 9b4a2d892a66304159fab5b24ba08917a1407f9e197844f628c7621e3db5c53f Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.036451 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-config-data\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.036579 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.036670 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w6gs\" (UniqueName: \"kubernetes.io/projected/d06000dd-9a73-4695-a477-0f361c61cf57-kube-api-access-7w6gs\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.036717 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-scripts\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.041528 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-config-data\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.041683 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-scripts\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.042966 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " 
pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.053307 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w6gs\" (UniqueName: \"kubernetes.io/projected/d06000dd-9a73-4695-a477-0f361c61cf57-kube-api-access-7w6gs\") pod \"nova-cell1-conductor-db-sync-qz2dw\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.150716 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.238323 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b454497-drt5d"] Oct 01 12:55:52 crc kubenswrapper[4913]: W1001 12:55:52.247671 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e51d4e0_c0a4_4a49_8d6c_7f063a3bc973.slice/crio-3503c16e9f3a159158e5215cb3822f052519a4c68123be5ea5a967d37d229ecf WatchSource:0}: Error finding container 3503c16e9f3a159158e5215cb3822f052519a4c68123be5ea5a967d37d229ecf: Status 404 returned error can't find the container with id 3503c16e9f3a159158e5215cb3822f052519a4c68123be5ea5a967d37d229ecf Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.311133 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.326838 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:52 crc kubenswrapper[4913]: W1001 12:55:52.364720 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0dce13_24d2_41a8_8464_ea574800e22c.slice/crio-6b3d175d01ba88be00a0245f1699aaa42dc2f7595e1bffb40f37d19d56ea4c43 WatchSource:0}: Error finding container 6b3d175d01ba88be00a0245f1699aaa42dc2f7595e1bffb40f37d19d56ea4c43: Status 404 returned error can't find the container with id 6b3d175d01ba88be00a0245f1699aaa42dc2f7595e1bffb40f37d19d56ea4c43 Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.611819 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qz2dw"] Oct 01 12:55:52 crc kubenswrapper[4913]: W1001 12:55:52.632565 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd06000dd_9a73_4695_a477_0f361c61cf57.slice/crio-b9252322212e94033b1b3cc97a95fdc2489a552d66fb90d762a86be9ba602e5c WatchSource:0}: Error finding container b9252322212e94033b1b3cc97a95fdc2489a552d66fb90d762a86be9ba602e5c: Status 404 returned error can't find the container with id b9252322212e94033b1b3cc97a95fdc2489a552d66fb90d762a86be9ba602e5c Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.986525 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8gshc" event={"ID":"36c41052-7191-462a-9627-9a2fbe9206b3","Type":"ContainerStarted","Data":"72b17a12133d5f02c9052daa557d3aaf68bbdf8173d07b3ace9edd03f049c752"} Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.988242 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"122bc201-edae-47f2-a752-818ba02b0dea","Type":"ContainerStarted","Data":"066d22631dfb3b92118491ccb38cf880f29f27f41412deb3734be63779a22fac"} Oct 01 12:55:52 
crc kubenswrapper[4913]: I1001 12:55:52.989208 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67969aa2-bca2-4e09-b0c7-cc8949849046","Type":"ContainerStarted","Data":"9b4a2d892a66304159fab5b24ba08917a1407f9e197844f628c7621e3db5c53f"} Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.990241 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" event={"ID":"d06000dd-9a73-4695-a477-0f361c61cf57","Type":"ContainerStarted","Data":"b9252322212e94033b1b3cc97a95fdc2489a552d66fb90d762a86be9ba602e5c"} Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.992380 4913 generic.go:334] "Generic (PLEG): container finished" podID="839626e4-5aad-4abf-b758-80755e37b5b3" containerID="74797fe20c5f882f1deddc601251f5765b0a34705cd4616ab5cb24147e03b90a" exitCode=0 Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.992442 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5759df9d4d-h2pz9" event={"ID":"839626e4-5aad-4abf-b758-80755e37b5b3","Type":"ContainerDied","Data":"74797fe20c5f882f1deddc601251f5765b0a34705cd4616ab5cb24147e03b90a"} Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.993634 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b454497-drt5d" event={"ID":"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973","Type":"ContainerStarted","Data":"9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512"} Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.993671 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b454497-drt5d" event={"ID":"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973","Type":"ContainerStarted","Data":"3503c16e9f3a159158e5215cb3822f052519a4c68123be5ea5a967d37d229ecf"} Oct 01 12:55:52 crc kubenswrapper[4913]: I1001 12:55:52.996471 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0dce13-24d2-41a8-8464-ea574800e22c","Type":"ContainerStarted","Data":"6b3d175d01ba88be00a0245f1699aaa42dc2f7595e1bffb40f37d19d56ea4c43"} Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.005858 4913 generic.go:334] "Generic (PLEG): container finished" podID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerID="9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512" exitCode=0 Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.005903 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b454497-drt5d" event={"ID":"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973","Type":"ContainerDied","Data":"9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512"} Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.007554 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" event={"ID":"d06000dd-9a73-4695-a477-0f361c61cf57","Type":"ContainerStarted","Data":"22b594d3a847571776800fd6b7741c6cebdb1088a933b58e14ad14dd505c11e3"} Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.073851 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8gshc" podStartSLOduration=4.073834527 podStartE2EDuration="4.073834527s" podCreationTimestamp="2025-10-01 12:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:54.072881671 +0000 UTC m=+1085.976357269" watchObservedRunningTime="2025-10-01 12:55:54.073834527 +0000 UTC 
m=+1085.977310105" Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.091915 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" podStartSLOduration=3.091898314 podStartE2EDuration="3.091898314s" podCreationTimestamp="2025-10-01 12:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:54.086407743 +0000 UTC m=+1085.989883331" watchObservedRunningTime="2025-10-01 12:55:54.091898314 +0000 UTC m=+1085.995373892" Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.335443 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:54 crc kubenswrapper[4913]: I1001 12:55:54.348093 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:55 crc kubenswrapper[4913]: I1001 12:55:55.017794 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b454497-drt5d" event={"ID":"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973","Type":"ContainerStarted","Data":"8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407"} Oct 01 12:55:57 crc kubenswrapper[4913]: I1001 12:55:57.041478 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:55:57 crc kubenswrapper[4913]: I1001 12:55:57.067800 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745b454497-drt5d" podStartSLOduration=6.067778131 podStartE2EDuration="6.067778131s" podCreationTimestamp="2025-10-01 12:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:57.058413514 +0000 UTC m=+1088.961889112" watchObservedRunningTime="2025-10-01 12:55:57.067778131 +0000 UTC m=+1088.971253739" Oct 01 12:56:01 crc kubenswrapper[4913]: I1001 12:56:01.622332 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:56:01 crc kubenswrapper[4913]: I1001 12:56:01.691970 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659bcf5cf5-8vzps"] Oct 01 12:56:01 crc kubenswrapper[4913]: I1001 12:56:01.692604 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="dnsmasq-dns" containerID="cri-o://8b186600a4b73e6dae5d57fa49e1fec309885f18083f6cb414480d396b9efba8" gracePeriod=10 Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.106755 4913 generic.go:334] "Generic (PLEG): container finished" podID="839626e4-5aad-4abf-b758-80755e37b5b3" containerID="dad297f85cc097635e915f216c866d1131fbd782e96b3725514e98b52ba268af" exitCode=0 Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.106807 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5759df9d4d-h2pz9" event={"ID":"839626e4-5aad-4abf-b758-80755e37b5b3","Type":"ContainerDied","Data":"dad297f85cc097635e915f216c866d1131fbd782e96b3725514e98b52ba268af"} Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.108706 4913 generic.go:334] "Generic (PLEG): container finished" podID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerID="8b186600a4b73e6dae5d57fa49e1fec309885f18083f6cb414480d396b9efba8" exitCode=0 Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.108737 
4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" event={"ID":"19e92ca3-0ea8-408d-902d-d0c6e283129f","Type":"ContainerDied","Data":"8b186600a4b73e6dae5d57fa49e1fec309885f18083f6cb414480d396b9efba8"} Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.110626 4913 generic.go:334] "Generic (PLEG): container finished" podID="36c41052-7191-462a-9627-9a2fbe9206b3" containerID="72b17a12133d5f02c9052daa557d3aaf68bbdf8173d07b3ace9edd03f049c752" exitCode=0 Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.110742 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8gshc" event={"ID":"36c41052-7191-462a-9627-9a2fbe9206b3","Type":"ContainerDied","Data":"72b17a12133d5f02c9052daa557d3aaf68bbdf8173d07b3ace9edd03f049c752"} Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.583412 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5759df9d4d-h2pz9" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.731048 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-config\") pod \"839626e4-5aad-4abf-b758-80755e37b5b3\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.731108 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-combined-ca-bundle\") pod \"839626e4-5aad-4abf-b758-80755e37b5b3\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.731284 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-ovndb-tls-certs\") pod \"839626e4-5aad-4abf-b758-80755e37b5b3\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.731373 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-httpd-config\") pod \"839626e4-5aad-4abf-b758-80755e37b5b3\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.731416 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85wk2\" (UniqueName: \"kubernetes.io/projected/839626e4-5aad-4abf-b758-80755e37b5b3-kube-api-access-85wk2\") pod \"839626e4-5aad-4abf-b758-80755e37b5b3\" (UID: \"839626e4-5aad-4abf-b758-80755e37b5b3\") " Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.740693 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "839626e4-5aad-4abf-b758-80755e37b5b3" (UID: "839626e4-5aad-4abf-b758-80755e37b5b3"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.740739 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839626e4-5aad-4abf-b758-80755e37b5b3-kube-api-access-85wk2" (OuterVolumeSpecName: "kube-api-access-85wk2") pod "839626e4-5aad-4abf-b758-80755e37b5b3" (UID: "839626e4-5aad-4abf-b758-80755e37b5b3"). InnerVolumeSpecName "kube-api-access-85wk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.785395 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-config" (OuterVolumeSpecName: "config") pod "839626e4-5aad-4abf-b758-80755e37b5b3" (UID: "839626e4-5aad-4abf-b758-80755e37b5b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.786389 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "839626e4-5aad-4abf-b758-80755e37b5b3" (UID: "839626e4-5aad-4abf-b758-80755e37b5b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.833510 4913 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.833624 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85wk2\" (UniqueName: \"kubernetes.io/projected/839626e4-5aad-4abf-b758-80755e37b5b3-kube-api-access-85wk2\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.833687 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.833744 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.837553 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "839626e4-5aad-4abf-b758-80755e37b5b3" (UID: "839626e4-5aad-4abf-b758-80755e37b5b3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:02 crc kubenswrapper[4913]: I1001 12:56:02.936023 4913 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/839626e4-5aad-4abf-b758-80755e37b5b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.091218 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.151078 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5759df9d4d-h2pz9" event={"ID":"839626e4-5aad-4abf-b758-80755e37b5b3","Type":"ContainerDied","Data":"0e81c03d8deaa45230204b9bedbfca94079e33477b480922eae275504d6a09b1"} Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.151125 4913 scope.go:117] "RemoveContainer" containerID="74797fe20c5f882f1deddc601251f5765b0a34705cd4616ab5cb24147e03b90a" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.151241 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5759df9d4d-h2pz9" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.160738 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" event={"ID":"19e92ca3-0ea8-408d-902d-d0c6e283129f","Type":"ContainerDied","Data":"79176e0a1437531fc84f94fd9e546ad06693fe8fa88dfe2bd6b33e8a8d608187"} Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.160782 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.190421 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5759df9d4d-h2pz9"] Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.199040 4913 scope.go:117] "RemoveContainer" containerID="dad297f85cc097635e915f216c866d1131fbd782e96b3725514e98b52ba268af" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.200740 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5759df9d4d-h2pz9"] Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.227531 4913 scope.go:117] "RemoveContainer" containerID="8b186600a4b73e6dae5d57fa49e1fec309885f18083f6cb414480d396b9efba8" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.239485 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-sb\") pod \"19e92ca3-0ea8-408d-902d-d0c6e283129f\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.239539 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-dns-svc\") pod \"19e92ca3-0ea8-408d-902d-d0c6e283129f\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.239572 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-config\") pod \"19e92ca3-0ea8-408d-902d-d0c6e283129f\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.239604 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4zm7\" (UniqueName: \"kubernetes.io/projected/19e92ca3-0ea8-408d-902d-d0c6e283129f-kube-api-access-g4zm7\") pod \"19e92ca3-0ea8-408d-902d-d0c6e283129f\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.239716 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-nb\") pod \"19e92ca3-0ea8-408d-902d-d0c6e283129f\" (UID: \"19e92ca3-0ea8-408d-902d-d0c6e283129f\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.243427 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e92ca3-0ea8-408d-902d-d0c6e283129f-kube-api-access-g4zm7" (OuterVolumeSpecName: "kube-api-access-g4zm7") pod "19e92ca3-0ea8-408d-902d-d0c6e283129f" (UID: "19e92ca3-0ea8-408d-902d-d0c6e283129f"). InnerVolumeSpecName "kube-api-access-g4zm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.283820 4913 scope.go:117] "RemoveContainer" containerID="0add67691b0d7dfbf83722bb94081213fbd4c826adab54dff271ed8ce956d1f6" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.308812 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19e92ca3-0ea8-408d-902d-d0c6e283129f" (UID: "19e92ca3-0ea8-408d-902d-d0c6e283129f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.314911 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-config" (OuterVolumeSpecName: "config") pod "19e92ca3-0ea8-408d-902d-d0c6e283129f" (UID: "19e92ca3-0ea8-408d-902d-d0c6e283129f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.323369 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19e92ca3-0ea8-408d-902d-d0c6e283129f" (UID: "19e92ca3-0ea8-408d-902d-d0c6e283129f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.330990 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19e92ca3-0ea8-408d-902d-d0c6e283129f" (UID: "19e92ca3-0ea8-408d-902d-d0c6e283129f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.342927 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.342960 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.342974 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4zm7\" (UniqueName: \"kubernetes.io/projected/19e92ca3-0ea8-408d-902d-d0c6e283129f-kube-api-access-g4zm7\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.342985 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.342993 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e92ca3-0ea8-408d-902d-d0c6e283129f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.432561 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.520152 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659bcf5cf5-8vzps"] Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.527857 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-659bcf5cf5-8vzps"] Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.545383 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcg72\" (UniqueName: \"kubernetes.io/projected/36c41052-7191-462a-9627-9a2fbe9206b3-kube-api-access-rcg72\") pod \"36c41052-7191-462a-9627-9a2fbe9206b3\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.546494 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-scripts\") pod \"36c41052-7191-462a-9627-9a2fbe9206b3\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.546541 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-combined-ca-bundle\") pod \"36c41052-7191-462a-9627-9a2fbe9206b3\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.546729 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-config-data\") pod \"36c41052-7191-462a-9627-9a2fbe9206b3\" (UID: \"36c41052-7191-462a-9627-9a2fbe9206b3\") " Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.554296 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-scripts" (OuterVolumeSpecName: "scripts") pod 
"36c41052-7191-462a-9627-9a2fbe9206b3" (UID: "36c41052-7191-462a-9627-9a2fbe9206b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.555191 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c41052-7191-462a-9627-9a2fbe9206b3-kube-api-access-rcg72" (OuterVolumeSpecName: "kube-api-access-rcg72") pod "36c41052-7191-462a-9627-9a2fbe9206b3" (UID: "36c41052-7191-462a-9627-9a2fbe9206b3"). InnerVolumeSpecName "kube-api-access-rcg72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.585898 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-config-data" (OuterVolumeSpecName: "config-data") pod "36c41052-7191-462a-9627-9a2fbe9206b3" (UID: "36c41052-7191-462a-9627-9a2fbe9206b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.598450 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36c41052-7191-462a-9627-9a2fbe9206b3" (UID: "36c41052-7191-462a-9627-9a2fbe9206b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.649426 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.649460 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcg72\" (UniqueName: \"kubernetes.io/projected/36c41052-7191-462a-9627-9a2fbe9206b3-kube-api-access-rcg72\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.649470 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:03 crc kubenswrapper[4913]: I1001 12:56:03.649479 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41052-7191-462a-9627-9a2fbe9206b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.173409 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0dce13-24d2-41a8-8464-ea574800e22c","Type":"ContainerStarted","Data":"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.173622 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0dce13-24d2-41a8-8464-ea574800e22c","Type":"ContainerStarted","Data":"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.179102 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67969aa2-bca2-4e09-b0c7-cc8949849046","Type":"ContainerStarted","Data":"5c106e6cc49513439675197ca109e98afc7deb97981069dcf76d3e1f0176e8b4"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.182043 4913 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8gshc" event={"ID":"36c41052-7191-462a-9627-9a2fbe9206b3","Type":"ContainerDied","Data":"feed6750a08eca3f6a749e9f118ffd5c65abeea2b023fca0a5b65ceb9a9f2cd4"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.182087 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feed6750a08eca3f6a749e9f118ffd5c65abeea2b023fca0a5b65ceb9a9f2cd4" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.182168 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8gshc" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.191483 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"122bc201-edae-47f2-a752-818ba02b0dea","Type":"ContainerStarted","Data":"3752308e4b5e6451986e149830f5ca336235e0ecbdd182c32a30d28192f1ef97"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.191614 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="122bc201-edae-47f2-a752-818ba02b0dea" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3752308e4b5e6451986e149830f5ca336235e0ecbdd182c32a30d28192f1ef97" gracePeriod=30 Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.198080 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.486890878 podStartE2EDuration="13.198062412s" podCreationTimestamp="2025-10-01 12:55:51 +0000 UTC" firstStartedPulling="2025-10-01 12:55:52.373724055 +0000 UTC m=+1084.277199633" lastFinishedPulling="2025-10-01 12:56:03.084895579 +0000 UTC m=+1094.988371167" observedRunningTime="2025-10-01 12:56:04.196997224 +0000 UTC m=+1096.100472822" watchObservedRunningTime="2025-10-01 12:56:04.198062412 +0000 UTC m=+1096.101537990" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.205861 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195c6907-2401-4fd4-8d60-8c4c7d5a6266","Type":"ContainerStarted","Data":"c53524a2ac73da774dd330227138670d0311e9959cccb8a0c79479a09a3c8793"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.205909 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195c6907-2401-4fd4-8d60-8c4c7d5a6266","Type":"ContainerStarted","Data":"b651e8bf2c2b144fcf4d9596ec8dc908a6877daadae6ca9a68dd5d8ae0c23d87"} Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.206037 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-log" containerID="cri-o://b651e8bf2c2b144fcf4d9596ec8dc908a6877daadae6ca9a68dd5d8ae0c23d87" gracePeriod=30 Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.206399 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-metadata" containerID="cri-o://c53524a2ac73da774dd330227138670d0311e9959cccb8a0c79479a09a3c8793" gracePeriod=30 Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.221427 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.465269725 podStartE2EDuration="13.221410263s" podCreationTimestamp="2025-10-01 12:55:51 +0000 UTC" 
firstStartedPulling="2025-10-01 12:55:52.363052412 +0000 UTC m=+1084.266527990" lastFinishedPulling="2025-10-01 12:56:03.11919295 +0000 UTC m=+1095.022668528" observedRunningTime="2025-10-01 12:56:04.218753571 +0000 UTC m=+1096.122229179" watchObservedRunningTime="2025-10-01 12:56:04.221410263 +0000 UTC m=+1096.124885851" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.244978 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.155552654 podStartE2EDuration="14.24495986s" podCreationTimestamp="2025-10-01 12:55:50 +0000 UTC" firstStartedPulling="2025-10-01 12:55:51.995466893 +0000 UTC m=+1083.898942471" lastFinishedPulling="2025-10-01 12:56:03.084874089 +0000 UTC m=+1094.988349677" observedRunningTime="2025-10-01 12:56:04.237116594 +0000 UTC m=+1096.140592202" watchObservedRunningTime="2025-10-01 12:56:04.24495986 +0000 UTC m=+1096.148435448" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.266008 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.028460196 podStartE2EDuration="14.265987626s" podCreationTimestamp="2025-10-01 12:55:50 +0000 UTC" firstStartedPulling="2025-10-01 12:55:51.878348619 +0000 UTC m=+1083.781824197" lastFinishedPulling="2025-10-01 12:56:03.115876049 +0000 UTC m=+1095.019351627" observedRunningTime="2025-10-01 12:56:04.262580323 +0000 UTC m=+1096.166055931" watchObservedRunningTime="2025-10-01 12:56:04.265987626 +0000 UTC m=+1096.169463204" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.306168 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.313799 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.831465 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" path="/var/lib/kubelet/pods/19e92ca3-0ea8-408d-902d-d0c6e283129f/volumes" Oct 01 12:56:04 crc kubenswrapper[4913]: I1001 12:56:04.832778 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" path="/var/lib/kubelet/pods/839626e4-5aad-4abf-b758-80755e37b5b3/volumes" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.234327 4913 generic.go:334] "Generic (PLEG): container finished" podID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerID="c53524a2ac73da774dd330227138670d0311e9959cccb8a0c79479a09a3c8793" exitCode=0 Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.234521 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195c6907-2401-4fd4-8d60-8c4c7d5a6266","Type":"ContainerDied","Data":"c53524a2ac73da774dd330227138670d0311e9959cccb8a0c79479a09a3c8793"} Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.234549 4913 generic.go:334] "Generic (PLEG): container finished" podID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerID="b651e8bf2c2b144fcf4d9596ec8dc908a6877daadae6ca9a68dd5d8ae0c23d87" exitCode=143 Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.234592 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195c6907-2401-4fd4-8d60-8c4c7d5a6266","Type":"ContainerDied","Data":"b651e8bf2c2b144fcf4d9596ec8dc908a6877daadae6ca9a68dd5d8ae0c23d87"} Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.394411 4913 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.478474 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-447pr\" (UniqueName: \"kubernetes.io/projected/195c6907-2401-4fd4-8d60-8c4c7d5a6266-kube-api-access-447pr\") pod \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.478614 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-config-data\") pod \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.478748 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195c6907-2401-4fd4-8d60-8c4c7d5a6266-logs\") pod \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.478800 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-combined-ca-bundle\") pod \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\" (UID: \"195c6907-2401-4fd4-8d60-8c4c7d5a6266\") " Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.479126 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/195c6907-2401-4fd4-8d60-8c4c7d5a6266-logs" (OuterVolumeSpecName: "logs") pod "195c6907-2401-4fd4-8d60-8c4c7d5a6266" (UID: "195c6907-2401-4fd4-8d60-8c4c7d5a6266"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.479297 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195c6907-2401-4fd4-8d60-8c4c7d5a6266-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.498057 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195c6907-2401-4fd4-8d60-8c4c7d5a6266-kube-api-access-447pr" (OuterVolumeSpecName: "kube-api-access-447pr") pod "195c6907-2401-4fd4-8d60-8c4c7d5a6266" (UID: "195c6907-2401-4fd4-8d60-8c4c7d5a6266"). InnerVolumeSpecName "kube-api-access-447pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.517157 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "195c6907-2401-4fd4-8d60-8c4c7d5a6266" (UID: "195c6907-2401-4fd4-8d60-8c4c7d5a6266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.519456 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-config-data" (OuterVolumeSpecName: "config-data") pod "195c6907-2401-4fd4-8d60-8c4c7d5a6266" (UID: "195c6907-2401-4fd4-8d60-8c4c7d5a6266"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.580198 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.580240 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195c6907-2401-4fd4-8d60-8c4c7d5a6266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:05 crc kubenswrapper[4913]: I1001 12:56:05.580252 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-447pr\" (UniqueName: \"kubernetes.io/projected/195c6907-2401-4fd4-8d60-8c4c7d5a6266-kube-api-access-447pr\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.245737 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195c6907-2401-4fd4-8d60-8c4c7d5a6266","Type":"ContainerDied","Data":"8b1f9f9c482532cb32a9e5f81980a87dd74ec705936109acafa67aacabaa33e5"} Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.245833 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.245841 4913 scope.go:117] "RemoveContainer" containerID="c53524a2ac73da774dd330227138670d0311e9959cccb8a0c79479a09a3c8793" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.245934 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="67969aa2-bca2-4e09-b0c7-cc8949849046" containerName="nova-scheduler-scheduler" containerID="cri-o://5c106e6cc49513439675197ca109e98afc7deb97981069dcf76d3e1f0176e8b4" gracePeriod=30 Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.245988 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-log" containerID="cri-o://51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d" gracePeriod=30 Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.246059 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-api" containerID="cri-o://adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7" gracePeriod=30 Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.290256 4913 scope.go:117] "RemoveContainer" containerID="b651e8bf2c2b144fcf4d9596ec8dc908a6877daadae6ca9a68dd5d8ae0c23d87" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.313933 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.323025 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.345948 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346390 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-metadata" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346406 4913 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-metadata" Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346417 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="dnsmasq-dns" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346424 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="dnsmasq-dns" Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346445 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-log" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346452 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-log" Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346462 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-httpd" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346467 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-httpd" Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346477 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="init" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346482 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="init" Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346490 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-api" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346495 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-api" Oct 01 12:56:06 crc kubenswrapper[4913]: E1001 12:56:06.346510 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c41052-7191-462a-9627-9a2fbe9206b3" containerName="nova-manage" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346515 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c41052-7191-462a-9627-9a2fbe9206b3" containerName="nova-manage" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346701 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-log" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346717 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-api" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346729 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c41052-7191-462a-9627-9a2fbe9206b3" containerName="nova-manage" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346743 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="dnsmasq-dns" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346752 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="839626e4-5aad-4abf-b758-80755e37b5b3" containerName="neutron-httpd" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.346761 4913 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" containerName="nova-metadata-metadata" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.347809 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.349825 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.350056 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.354649 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.498584 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-config-data\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.498632 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.498637 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.498676 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-str8w\" (UniqueName: \"kubernetes.io/projected/57b23d8d-d51d-474a-b84d-644e88b745d5-kube-api-access-str8w\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.498722 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57b23d8d-d51d-474a-b84d-644e88b745d5-logs\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.498786 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.600782 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.600924 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-config-data\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" 
Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.600944 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.600972 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-str8w\" (UniqueName: \"kubernetes.io/projected/57b23d8d-d51d-474a-b84d-644e88b745d5-kube-api-access-str8w\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.601016 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57b23d8d-d51d-474a-b84d-644e88b745d5-logs\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.601982 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57b23d8d-d51d-474a-b84d-644e88b745d5-logs\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.605182 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-config-data\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.606131 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.607862 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.627091 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-str8w\" (UniqueName: \"kubernetes.io/projected/57b23d8d-d51d-474a-b84d-644e88b745d5-kube-api-access-str8w\") pod \"nova-metadata-0\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.631126 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.713467 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:06 crc kubenswrapper[4913]: I1001 12:56:06.825463 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195c6907-2401-4fd4-8d60-8c4c7d5a6266" path="/var/lib/kubelet/pods/195c6907-2401-4fd4-8d60-8c4c7d5a6266/volumes" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.178300 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.242366 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.263429 4913 generic.go:334] "Generic (PLEG): container finished" podID="67969aa2-bca2-4e09-b0c7-cc8949849046" containerID="5c106e6cc49513439675197ca109e98afc7deb97981069dcf76d3e1f0176e8b4" exitCode=0 Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.263493 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67969aa2-bca2-4e09-b0c7-cc8949849046","Type":"ContainerDied","Data":"5c106e6cc49513439675197ca109e98afc7deb97981069dcf76d3e1f0176e8b4"} Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.265381 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"57b23d8d-d51d-474a-b84d-644e88b745d5","Type":"ContainerStarted","Data":"90c4fb77eaefe411f55038dfd49791a5c91262cbfd776e2b210f81cef6a313dc"} Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267011 4913 generic.go:334] "Generic (PLEG): container finished" podID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerID="adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7" exitCode=0 Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267031 4913 generic.go:334] "Generic (PLEG): container finished" podID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerID="51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d" exitCode=143 Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267044 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0dce13-24d2-41a8-8464-ea574800e22c","Type":"ContainerDied","Data":"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7"} Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267058 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0dce13-24d2-41a8-8464-ea574800e22c","Type":"ContainerDied","Data":"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d"} Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267068 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0dce13-24d2-41a8-8464-ea574800e22c","Type":"ContainerDied","Data":"6b3d175d01ba88be00a0245f1699aaa42dc2f7595e1bffb40f37d19d56ea4c43"} Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267083 4913 scope.go:117] "RemoveContainer" containerID="adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.267093 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.301494 4913 scope.go:117] "RemoveContainer" containerID="51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.311687 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-combined-ca-bundle\") pod \"8d0dce13-24d2-41a8-8464-ea574800e22c\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.311791 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-config-data\") pod \"8d0dce13-24d2-41a8-8464-ea574800e22c\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.311814 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4fpp\" (UniqueName: \"kubernetes.io/projected/8d0dce13-24d2-41a8-8464-ea574800e22c-kube-api-access-v4fpp\") pod \"8d0dce13-24d2-41a8-8464-ea574800e22c\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.311931 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0dce13-24d2-41a8-8464-ea574800e22c-logs\") pod \"8d0dce13-24d2-41a8-8464-ea574800e22c\" (UID: \"8d0dce13-24d2-41a8-8464-ea574800e22c\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.312665 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0dce13-24d2-41a8-8464-ea574800e22c-logs" (OuterVolumeSpecName: "logs") pod "8d0dce13-24d2-41a8-8464-ea574800e22c" (UID: "8d0dce13-24d2-41a8-8464-ea574800e22c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.321437 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0dce13-24d2-41a8-8464-ea574800e22c-kube-api-access-v4fpp" (OuterVolumeSpecName: "kube-api-access-v4fpp") pod "8d0dce13-24d2-41a8-8464-ea574800e22c" (UID: "8d0dce13-24d2-41a8-8464-ea574800e22c"). InnerVolumeSpecName "kube-api-access-v4fpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.322438 4913 scope.go:117] "RemoveContainer" containerID="adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7" Oct 01 12:56:07 crc kubenswrapper[4913]: E1001 12:56:07.322918 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7\": container with ID starting with adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7 not found: ID does not exist" containerID="adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.322974 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7"} err="failed to get container status \"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7\": rpc error: code = NotFound desc = could not find container \"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7\": container with ID starting with adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7 not found: ID does not exist" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.323001 4913 scope.go:117] "RemoveContainer" containerID="51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d" Oct 01 12:56:07 crc kubenswrapper[4913]: E1001 12:56:07.323541 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d\": container with ID starting with 51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d not found: ID does not exist" containerID="51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.323571 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d"} err="failed to get container status \"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d\": rpc error: code = NotFound desc = could not find container \"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d\": container with ID starting with 51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d not found: ID does not exist" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.323591 4913 scope.go:117] "RemoveContainer" containerID="adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.323822 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7"} err="failed to get container status \"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7\": rpc error: code = NotFound desc = could not find container \"adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7\": container with ID starting with adcd974cff0b8b9db99cfa7eb64fa9fca7d225ea26e2dcb4d60525f0b6494db7 not found: ID does not exist" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.323843 4913 scope.go:117] "RemoveContainer" containerID="51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.324556 4913 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d"} err="failed to get container status \"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d\": rpc error: code = NotFound desc = could not find container \"51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d\": container with ID starting with 51894f554392c3a38075f85e387562c50bbacd20bdafc273d7f7632d1365a17d not found: ID does not exist" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.356095 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.369845 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d0dce13-24d2-41a8-8464-ea574800e22c" (UID: "8d0dce13-24d2-41a8-8464-ea574800e22c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.381731 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-config-data" (OuterVolumeSpecName: "config-data") pod "8d0dce13-24d2-41a8-8464-ea574800e22c" (UID: "8d0dce13-24d2-41a8-8464-ea574800e22c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413122 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kggj2\" (UniqueName: \"kubernetes.io/projected/67969aa2-bca2-4e09-b0c7-cc8949849046-kube-api-access-kggj2\") pod \"67969aa2-bca2-4e09-b0c7-cc8949849046\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413198 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-combined-ca-bundle\") pod \"67969aa2-bca2-4e09-b0c7-cc8949849046\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413283 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-config-data\") pod \"67969aa2-bca2-4e09-b0c7-cc8949849046\" (UID: \"67969aa2-bca2-4e09-b0c7-cc8949849046\") " Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413722 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413744 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4fpp\" (UniqueName: \"kubernetes.io/projected/8d0dce13-24d2-41a8-8464-ea574800e22c-kube-api-access-v4fpp\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413758 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0dce13-24d2-41a8-8464-ea574800e22c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.413769 4913 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0dce13-24d2-41a8-8464-ea574800e22c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.420743 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67969aa2-bca2-4e09-b0c7-cc8949849046-kube-api-access-kggj2" (OuterVolumeSpecName: "kube-api-access-kggj2") pod "67969aa2-bca2-4e09-b0c7-cc8949849046" (UID: "67969aa2-bca2-4e09-b0c7-cc8949849046"). InnerVolumeSpecName "kube-api-access-kggj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.440799 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-config-data" (OuterVolumeSpecName: "config-data") pod "67969aa2-bca2-4e09-b0c7-cc8949849046" (UID: "67969aa2-bca2-4e09-b0c7-cc8949849046"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.446011 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67969aa2-bca2-4e09-b0c7-cc8949849046" (UID: "67969aa2-bca2-4e09-b0c7-cc8949849046"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.517426 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.517465 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67969aa2-bca2-4e09-b0c7-cc8949849046-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.517474 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kggj2\" (UniqueName: \"kubernetes.io/projected/67969aa2-bca2-4e09-b0c7-cc8949849046-kube-api-access-kggj2\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.618692 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.634024 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.646297 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:07 crc kubenswrapper[4913]: E1001 12:56:07.646752 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67969aa2-bca2-4e09-b0c7-cc8949849046" containerName="nova-scheduler-scheduler" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.646771 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="67969aa2-bca2-4e09-b0c7-cc8949849046" containerName="nova-scheduler-scheduler" Oct 01 12:56:07 crc kubenswrapper[4913]: E1001 12:56:07.646790 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-log" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.646797 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-log" Oct 
01 12:56:07 crc kubenswrapper[4913]: E1001 12:56:07.646827 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-api" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.646833 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-api" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.647003 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="67969aa2-bca2-4e09-b0c7-cc8949849046" containerName="nova-scheduler-scheduler" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.647022 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-log" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.647039 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" containerName="nova-api-api" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.648104 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.654210 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.659679 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.721541 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-config-data\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.721667 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bb9ee3-6f44-4a6e-a338-ae124d79498b-logs\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.721718 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.722015 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw75h\" (UniqueName: \"kubernetes.io/projected/72bb9ee3-6f44-4a6e-a338-ae124d79498b-kube-api-access-gw75h\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.744555 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-659bcf5cf5-8vzps" podUID="19e92ca3-0ea8-408d-902d-d0c6e283129f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.164:5353: i/o timeout" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.823877 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bb9ee3-6f44-4a6e-a338-ae124d79498b-logs\") pod \"nova-api-0\" (UID: 
\"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.823964 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.824048 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw75h\" (UniqueName: \"kubernetes.io/projected/72bb9ee3-6f44-4a6e-a338-ae124d79498b-kube-api-access-gw75h\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.824072 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-config-data\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.824386 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bb9ee3-6f44-4a6e-a338-ae124d79498b-logs\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.830164 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-config-data\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.830693 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.841287 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw75h\" (UniqueName: \"kubernetes.io/projected/72bb9ee3-6f44-4a6e-a338-ae124d79498b-kube-api-access-gw75h\") pod \"nova-api-0\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " pod="openstack/nova-api-0" Oct 01 12:56:07 crc kubenswrapper[4913]: I1001 12:56:07.968593 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.278763 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67969aa2-bca2-4e09-b0c7-cc8949849046","Type":"ContainerDied","Data":"9b4a2d892a66304159fab5b24ba08917a1407f9e197844f628c7621e3db5c53f"} Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.278794 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.278835 4913 scope.go:117] "RemoveContainer" containerID="5c106e6cc49513439675197ca109e98afc7deb97981069dcf76d3e1f0176e8b4" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.281347 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"57b23d8d-d51d-474a-b84d-644e88b745d5","Type":"ContainerStarted","Data":"972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179"} Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.281381 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"57b23d8d-d51d-474a-b84d-644e88b745d5","Type":"ContainerStarted","Data":"ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e"} Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.313242 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.313217979 podStartE2EDuration="2.313217979s" podCreationTimestamp="2025-10-01 12:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:08.304816708 +0000 UTC m=+1100.208292296" watchObservedRunningTime="2025-10-01 12:56:08.313217979 +0000 UTC m=+1100.216693567" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.346724 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.356656 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.365215 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.367167 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.370621 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.374145 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.399762 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.442934 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-config-data\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.443047 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.443187 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4np4\" (UniqueName: \"kubernetes.io/projected/741b4306-efcc-4357-af8d-d74daab515ea-kube-api-access-q4np4\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.544342 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4np4\" (UniqueName: \"kubernetes.io/projected/741b4306-efcc-4357-af8d-d74daab515ea-kube-api-access-q4np4\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.544476 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-config-data\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.544516 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.549103 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.557573 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-config-data\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 
12:56:08.561186 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4np4\" (UniqueName: \"kubernetes.io/projected/741b4306-efcc-4357-af8d-d74daab515ea-kube-api-access-q4np4\") pod \"nova-scheduler-0\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.761377 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.824925 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67969aa2-bca2-4e09-b0c7-cc8949849046" path="/var/lib/kubelet/pods/67969aa2-bca2-4e09-b0c7-cc8949849046/volumes" Oct 01 12:56:08 crc kubenswrapper[4913]: I1001 12:56:08.825957 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0dce13-24d2-41a8-8464-ea574800e22c" path="/var/lib/kubelet/pods/8d0dce13-24d2-41a8-8464-ea574800e22c/volumes" Oct 01 12:56:09 crc kubenswrapper[4913]: I1001 12:56:09.173707 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:09 crc kubenswrapper[4913]: W1001 12:56:09.177492 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod741b4306_efcc_4357_af8d_d74daab515ea.slice/crio-66e9d902b5cde517349af0b3ec824513125f5f6b256bb1208c92e449f734b584 WatchSource:0}: Error finding container 66e9d902b5cde517349af0b3ec824513125f5f6b256bb1208c92e449f734b584: Status 404 returned error can't find the container with id 66e9d902b5cde517349af0b3ec824513125f5f6b256bb1208c92e449f734b584 Oct 01 12:56:09 crc kubenswrapper[4913]: I1001 12:56:09.293487 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72bb9ee3-6f44-4a6e-a338-ae124d79498b","Type":"ContainerStarted","Data":"94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f"} Oct 01 12:56:09 crc kubenswrapper[4913]: I1001 12:56:09.293843 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72bb9ee3-6f44-4a6e-a338-ae124d79498b","Type":"ContainerStarted","Data":"6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9"} Oct 01 12:56:09 crc kubenswrapper[4913]: I1001 12:56:09.293857 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72bb9ee3-6f44-4a6e-a338-ae124d79498b","Type":"ContainerStarted","Data":"9db6c342e2b58e504d1a1ed749ca29f13a141302f95f1e2d968845c690b29e17"} Oct 01 12:56:09 crc kubenswrapper[4913]: I1001 12:56:09.298177 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"741b4306-efcc-4357-af8d-d74daab515ea","Type":"ContainerStarted","Data":"66e9d902b5cde517349af0b3ec824513125f5f6b256bb1208c92e449f734b584"} Oct 01 12:56:09 crc kubenswrapper[4913]: I1001 12:56:09.312793 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.312776353 podStartE2EDuration="2.312776353s" podCreationTimestamp="2025-10-01 12:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:09.312320291 +0000 UTC m=+1101.215795879" watchObservedRunningTime="2025-10-01 12:56:09.312776353 +0000 UTC m=+1101.216251931" Oct 01 12:56:10 crc kubenswrapper[4913]: I1001 12:56:10.083647 4913 patch_prober.go:28] interesting 
pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:56:10 crc kubenswrapper[4913]: I1001 12:56:10.084012 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:56:10 crc kubenswrapper[4913]: I1001 12:56:10.308892 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"741b4306-efcc-4357-af8d-d74daab515ea","Type":"ContainerStarted","Data":"c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4"} Oct 01 12:56:10 crc kubenswrapper[4913]: I1001 12:56:10.333659 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.333642772 podStartE2EDuration="2.333642772s" podCreationTimestamp="2025-10-01 12:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:10.324493811 +0000 UTC m=+1102.227969429" watchObservedRunningTime="2025-10-01 12:56:10.333642772 +0000 UTC m=+1102.237118350" Oct 01 12:56:11 crc kubenswrapper[4913]: I1001 12:56:11.218825 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 12:56:11 crc kubenswrapper[4913]: I1001 12:56:11.714372 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:56:11 crc kubenswrapper[4913]: I1001 12:56:11.714832 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:56:13 crc kubenswrapper[4913]: I1001 12:56:13.517955 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:13 crc kubenswrapper[4913]: I1001 12:56:13.518221 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="96fdbf7e-8781-4743-a69c-e56b650fb429" containerName="kube-state-metrics" containerID="cri-o://0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149" gracePeriod=30 Oct 01 12:56:13 crc kubenswrapper[4913]: I1001 12:56:13.761473 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.070094 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.150865 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjvz4\" (UniqueName: \"kubernetes.io/projected/96fdbf7e-8781-4743-a69c-e56b650fb429-kube-api-access-vjvz4\") pod \"96fdbf7e-8781-4743-a69c-e56b650fb429\" (UID: \"96fdbf7e-8781-4743-a69c-e56b650fb429\") " Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.156189 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fdbf7e-8781-4743-a69c-e56b650fb429-kube-api-access-vjvz4" (OuterVolumeSpecName: "kube-api-access-vjvz4") pod "96fdbf7e-8781-4743-a69c-e56b650fb429" (UID: "96fdbf7e-8781-4743-a69c-e56b650fb429"). InnerVolumeSpecName "kube-api-access-vjvz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.253631 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjvz4\" (UniqueName: \"kubernetes.io/projected/96fdbf7e-8781-4743-a69c-e56b650fb429-kube-api-access-vjvz4\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.355726 4913 generic.go:334] "Generic (PLEG): container finished" podID="d06000dd-9a73-4695-a477-0f361c61cf57" containerID="22b594d3a847571776800fd6b7741c6cebdb1088a933b58e14ad14dd505c11e3" exitCode=0 Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.355801 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" event={"ID":"d06000dd-9a73-4695-a477-0f361c61cf57","Type":"ContainerDied","Data":"22b594d3a847571776800fd6b7741c6cebdb1088a933b58e14ad14dd505c11e3"} Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.357377 4913 generic.go:334] "Generic (PLEG): container finished" podID="96fdbf7e-8781-4743-a69c-e56b650fb429" containerID="0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149" exitCode=2 Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.357440 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96fdbf7e-8781-4743-a69c-e56b650fb429","Type":"ContainerDied","Data":"0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149"} Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.357479 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96fdbf7e-8781-4743-a69c-e56b650fb429","Type":"ContainerDied","Data":"0bc35e333c156d82100adfbe7c30947cf62517717bb45bed4973f9c25432a4e5"} Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.357507 4913 scope.go:117] "RemoveContainer" containerID="0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.357677 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.398866 4913 scope.go:117] "RemoveContainer" containerID="0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149" Oct 01 12:56:14 crc kubenswrapper[4913]: E1001 12:56:14.399920 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149\": container with ID starting with 0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149 not found: ID does not exist" containerID="0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.399992 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149"} err="failed to get container status \"0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149\": rpc error: code = NotFound desc = could not find container \"0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149\": container with ID starting with 0ae55ac73c73fa949966ee095924415c34a989f84ed66198458548d86fa90149 not found: ID does not exist" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.405029 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.416511 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.428023 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:14 crc kubenswrapper[4913]: E1001 12:56:14.428482 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fdbf7e-8781-4743-a69c-e56b650fb429" containerName="kube-state-metrics" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.428502 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fdbf7e-8781-4743-a69c-e56b650fb429" containerName="kube-state-metrics" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.428694 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fdbf7e-8781-4743-a69c-e56b650fb429" containerName="kube-state-metrics" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.429227 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.431770 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.431935 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.453185 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.558985 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.559053 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8z8\" (UniqueName: \"kubernetes.io/projected/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-api-access-qj8z8\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.559388 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.559734 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.567325 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.567594 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-central-agent" containerID="cri-o://21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6" gracePeriod=30 Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.567676 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-notification-agent" containerID="cri-o://1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6" gracePeriod=30 Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.567698 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="sg-core" containerID="cri-o://9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356" gracePeriod=30 Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.567870 4913 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="proxy-httpd" containerID="cri-o://79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41" gracePeriod=30 Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.661550 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.661608 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8z8\" (UniqueName: \"kubernetes.io/projected/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-api-access-qj8z8\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.661690 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.661781 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.666169 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.666605 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.668009 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0093d5f-95cb-4a40-877b-01ddb11c929b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.683142 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8z8\" (UniqueName: \"kubernetes.io/projected/b0093d5f-95cb-4a40-877b-01ddb11c929b-kube-api-access-qj8z8\") pod \"kube-state-metrics-0\" (UID: \"b0093d5f-95cb-4a40-877b-01ddb11c929b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.746588 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:14 crc kubenswrapper[4913]: I1001 12:56:14.818523 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fdbf7e-8781-4743-a69c-e56b650fb429" path="/var/lib/kubelet/pods/96fdbf7e-8781-4743-a69c-e56b650fb429/volumes" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.214003 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:15 crc kubenswrapper[4913]: W1001 12:56:15.215782 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0093d5f_95cb_4a40_877b_01ddb11c929b.slice/crio-f6470c65bbd37f569ea59a75f9abc88371ce85dc2dcc133e2bda0080d3a6efc6 WatchSource:0}: Error finding container f6470c65bbd37f569ea59a75f9abc88371ce85dc2dcc133e2bda0080d3a6efc6: Status 404 returned error can't find the container with id f6470c65bbd37f569ea59a75f9abc88371ce85dc2dcc133e2bda0080d3a6efc6 Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.368086 4913 generic.go:334] "Generic (PLEG): container finished" podID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerID="79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41" exitCode=0 Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.368118 4913 generic.go:334] "Generic (PLEG): container finished" podID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerID="9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356" exitCode=2 Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.368126 4913 generic.go:334] "Generic (PLEG): container finished" podID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerID="21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6" exitCode=0 Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.368158 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerDied","Data":"79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41"} Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.368182 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerDied","Data":"9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356"} Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.368191 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerDied","Data":"21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6"} Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.369219 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b0093d5f-95cb-4a40-877b-01ddb11c929b","Type":"ContainerStarted","Data":"f6470c65bbd37f569ea59a75f9abc88371ce85dc2dcc133e2bda0080d3a6efc6"} Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.648387 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.784393 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-config-data\") pod \"d06000dd-9a73-4695-a477-0f361c61cf57\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.784802 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-scripts\") pod \"d06000dd-9a73-4695-a477-0f361c61cf57\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.784853 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w6gs\" (UniqueName: \"kubernetes.io/projected/d06000dd-9a73-4695-a477-0f361c61cf57-kube-api-access-7w6gs\") pod \"d06000dd-9a73-4695-a477-0f361c61cf57\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.784901 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-combined-ca-bundle\") pod \"d06000dd-9a73-4695-a477-0f361c61cf57\" (UID: \"d06000dd-9a73-4695-a477-0f361c61cf57\") " Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.790470 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-scripts" (OuterVolumeSpecName: "scripts") pod "d06000dd-9a73-4695-a477-0f361c61cf57" (UID: "d06000dd-9a73-4695-a477-0f361c61cf57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.790522 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06000dd-9a73-4695-a477-0f361c61cf57-kube-api-access-7w6gs" (OuterVolumeSpecName: "kube-api-access-7w6gs") pod "d06000dd-9a73-4695-a477-0f361c61cf57" (UID: "d06000dd-9a73-4695-a477-0f361c61cf57"). InnerVolumeSpecName "kube-api-access-7w6gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.817220 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-config-data" (OuterVolumeSpecName: "config-data") pod "d06000dd-9a73-4695-a477-0f361c61cf57" (UID: "d06000dd-9a73-4695-a477-0f361c61cf57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.822470 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d06000dd-9a73-4695-a477-0f361c61cf57" (UID: "d06000dd-9a73-4695-a477-0f361c61cf57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.886517 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.886549 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.886558 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w6gs\" (UniqueName: \"kubernetes.io/projected/d06000dd-9a73-4695-a477-0f361c61cf57-kube-api-access-7w6gs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:15 crc kubenswrapper[4913]: I1001 12:56:15.886567 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06000dd-9a73-4695-a477-0f361c61cf57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.382174 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b0093d5f-95cb-4a40-877b-01ddb11c929b","Type":"ContainerStarted","Data":"b2dd417801de56cdc3545994bb2761717010a154268ec70f3c1a7fc451b24f99"} Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.382643 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.384595 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" event={"ID":"d06000dd-9a73-4695-a477-0f361c61cf57","Type":"ContainerDied","Data":"b9252322212e94033b1b3cc97a95fdc2489a552d66fb90d762a86be9ba602e5c"} Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.384633 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9252322212e94033b1b3cc97a95fdc2489a552d66fb90d762a86be9ba602e5c" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.384670 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qz2dw" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.408851 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.538838496 podStartE2EDuration="2.408830125s" podCreationTimestamp="2025-10-01 12:56:14 +0000 UTC" firstStartedPulling="2025-10-01 12:56:15.218130904 +0000 UTC m=+1107.121606472" lastFinishedPulling="2025-10-01 12:56:16.088122523 +0000 UTC m=+1107.991598101" observedRunningTime="2025-10-01 12:56:16.402655065 +0000 UTC m=+1108.306130663" watchObservedRunningTime="2025-10-01 12:56:16.408830125 +0000 UTC m=+1108.312305703" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.445744 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 12:56:16 crc kubenswrapper[4913]: E1001 12:56:16.446151 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06000dd-9a73-4695-a477-0f361c61cf57" containerName="nova-cell1-conductor-db-sync" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.446164 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06000dd-9a73-4695-a477-0f361c61cf57" containerName="nova-cell1-conductor-db-sync" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.446330 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06000dd-9a73-4695-a477-0f361c61cf57" containerName="nova-cell1-conductor-db-sync" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.446913 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.448660 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.456042 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.496850 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde6493b-0af5-4f13-9766-fde504bf6abc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.496888 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde6493b-0af5-4f13-9766-fde504bf6abc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.497128 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp5b5\" (UniqueName: \"kubernetes.io/projected/bde6493b-0af5-4f13-9766-fde504bf6abc-kube-api-access-vp5b5\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.599142 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp5b5\" (UniqueName: \"kubernetes.io/projected/bde6493b-0af5-4f13-9766-fde504bf6abc-kube-api-access-vp5b5\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " 
pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.599463 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde6493b-0af5-4f13-9766-fde504bf6abc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.599489 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde6493b-0af5-4f13-9766-fde504bf6abc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.605012 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde6493b-0af5-4f13-9766-fde504bf6abc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.605671 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde6493b-0af5-4f13-9766-fde504bf6abc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.615253 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp5b5\" (UniqueName: \"kubernetes.io/projected/bde6493b-0af5-4f13-9766-fde504bf6abc-kube-api-access-vp5b5\") pod \"nova-cell1-conductor-0\" (UID: \"bde6493b-0af5-4f13-9766-fde504bf6abc\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.713991 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.714030 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:56:16 crc kubenswrapper[4913]: I1001 12:56:16.761586 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:17 crc kubenswrapper[4913]: W1001 12:56:17.218830 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbde6493b_0af5_4f13_9766_fde504bf6abc.slice/crio-c468d846133029a5ed2db1e0785f9c48541f8c46fc379eae42937060a22898c3 WatchSource:0}: Error finding container c468d846133029a5ed2db1e0785f9c48541f8c46fc379eae42937060a22898c3: Status 404 returned error can't find the container with id c468d846133029a5ed2db1e0785f9c48541f8c46fc379eae42937060a22898c3 Oct 01 12:56:17 crc kubenswrapper[4913]: I1001 12:56:17.220705 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 12:56:17 crc kubenswrapper[4913]: I1001 12:56:17.393460 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bde6493b-0af5-4f13-9766-fde504bf6abc","Type":"ContainerStarted","Data":"c468d846133029a5ed2db1e0785f9c48541f8c46fc379eae42937060a22898c3"} Oct 01 12:56:17 crc kubenswrapper[4913]: I1001 12:56:17.728478 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:17 crc kubenswrapper[4913]: I1001 12:56:17.728495 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:17 crc kubenswrapper[4913]: I1001 12:56:17.969746 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:17 crc kubenswrapper[4913]: I1001 12:56:17.969812 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:18 crc kubenswrapper[4913]: I1001 12:56:18.408646 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bde6493b-0af5-4f13-9766-fde504bf6abc","Type":"ContainerStarted","Data":"c5b3f1ca79d9328c0a7e4ee9ce246d6f4d490a4df4303588c767ccb713d2d006"} Oct 01 12:56:18 crc kubenswrapper[4913]: I1001 12:56:18.409021 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 12:56:18 crc kubenswrapper[4913]: I1001 12:56:18.433723 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.433702061 podStartE2EDuration="2.433702061s" podCreationTimestamp="2025-10-01 12:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:18.423367807 +0000 UTC m=+1110.326843405" watchObservedRunningTime="2025-10-01 12:56:18.433702061 +0000 UTC m=+1110.337177639" Oct 01 12:56:18 crc kubenswrapper[4913]: I1001 12:56:18.762084 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 12:56:18 crc kubenswrapper[4913]: I1001 12:56:18.795674 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 
12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.052537 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.052538 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.181957 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357220 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-sg-core-conf-yaml\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357567 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-run-httpd\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357657 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs9tl\" (UniqueName: \"kubernetes.io/projected/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-kube-api-access-gs9tl\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357682 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-log-httpd\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357739 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-config-data\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357850 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-scripts\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357883 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-combined-ca-bundle\") pod \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\" (UID: \"a245020b-93c0-45eb-8ccc-ec2ed1c8392c\") " Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.357979 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.358387 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.363256 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-scripts" (OuterVolumeSpecName: "scripts") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.364520 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-kube-api-access-gs9tl" (OuterVolumeSpecName: "kube-api-access-gs9tl") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "kube-api-access-gs9tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.390682 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.400972 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.420733 4913 generic.go:334] "Generic (PLEG): container finished" podID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerID="1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6" exitCode=0 Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.420787 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.421028 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerDied","Data":"1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6"} Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.421152 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a245020b-93c0-45eb-8ccc-ec2ed1c8392c","Type":"ContainerDied","Data":"46f1f5be0f1b7a073aa6853acaa70f0ab7c3a0137cad980b6a5582b4cdf123ef"} Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.421202 4913 scope.go:117] "RemoveContainer" containerID="79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.455285 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.460068 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.460262 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs9tl\" (UniqueName: \"kubernetes.io/projected/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-kube-api-access-gs9tl\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.460305 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.460357 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.557404 4913 scope.go:117] "RemoveContainer" containerID="9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.570579 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.592940 4913 scope.go:117] "RemoveContainer" containerID="1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.602962 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-config-data" (OuterVolumeSpecName: "config-data") pod "a245020b-93c0-45eb-8ccc-ec2ed1c8392c" (UID: "a245020b-93c0-45eb-8ccc-ec2ed1c8392c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.666920 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.666952 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a245020b-93c0-45eb-8ccc-ec2ed1c8392c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.761992 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.782405 4913 scope.go:117] "RemoveContainer" containerID="21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.798680 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.814365 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.814860 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-notification-agent" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.814884 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-notification-agent" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.814912 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="proxy-httpd" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.814923 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="proxy-httpd" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.814962 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-central-agent" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.814971 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-central-agent" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.814986 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="sg-core" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.814993 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="sg-core" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.815205 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="proxy-httpd" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.815223 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="sg-core" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.815235 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-notification-agent" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.815246 4913 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" containerName="ceilometer-central-agent" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.817573 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.821000 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.834932 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.835243 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.835381 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.861548 4913 scope.go:117] "RemoveContainer" containerID="79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.863018 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41\": container with ID starting with 79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41 not found: ID does not exist" containerID="79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.863056 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41"} err="failed to get container status \"79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41\": rpc error: code = NotFound desc = could not find container \"79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41\": container with ID starting with 79a793b0f411caaf80f2818cae3f8396c991997c761ed8af617553b5a8df2b41 not found: ID does not exist" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.863081 4913 scope.go:117] "RemoveContainer" containerID="9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.863653 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356\": container with ID starting with 9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356 not found: ID does not exist" containerID="9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.863690 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356"} err="failed to get container status \"9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356\": rpc error: code = NotFound desc = could not find container \"9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356\": container with ID starting with 9c6efa8a635a2223c347cc4c2dff7b60119f4853060b4b3adf67681b29537356 not found: ID does not exist" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.863714 4913 scope.go:117] "RemoveContainer" 
containerID="1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.865139 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6\": container with ID starting with 1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6 not found: ID does not exist" containerID="1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.865172 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6"} err="failed to get container status \"1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6\": rpc error: code = NotFound desc = could not find container \"1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6\": container with ID starting with 1be0ad7d5fe4b12c23b476e46e37b7e18d4648f39f9e396c6a617eb9465ee0a6 not found: ID does not exist" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.865192 4913 scope.go:117] "RemoveContainer" containerID="21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6" Oct 01 12:56:19 crc kubenswrapper[4913]: E1001 12:56:19.865721 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6\": container with ID starting with 21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6 not found: ID does not exist" containerID="21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.865757 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6"} err="failed to get container status \"21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6\": rpc error: code = NotFound desc = could not find container \"21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6\": container with ID starting with 21d22c7b0bc899eca2c99d21c39f495b456ae359e92f84211949f2bb11f47ce6 not found: ID does not exist" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971246 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-config-data\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971338 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971390 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 
12:56:19.971443 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-run-httpd\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971471 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-scripts\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971498 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-log-httpd\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971537 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:19 crc kubenswrapper[4913]: I1001 12:56:19.971563 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqhr\" (UniqueName: \"kubernetes.io/projected/8c737289-8db3-4914-b911-26256b36c4f4-kube-api-access-jgqhr\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073525 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073586 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073640 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-run-httpd\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073678 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-scripts\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073705 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-log-httpd\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " 
pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073751 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073781 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqhr\" (UniqueName: \"kubernetes.io/projected/8c737289-8db3-4914-b911-26256b36c4f4-kube-api-access-jgqhr\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.073954 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-config-data\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.074797 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-run-httpd\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.074879 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-log-httpd\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.081450 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-config-data\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.082180 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.082343 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.094909 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.095040 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-scripts\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0" Oct 01 12:56:20 crc 
Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.095175 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqhr\" (UniqueName: \"kubernetes.io/projected/8c737289-8db3-4914-b911-26256b36c4f4-kube-api-access-jgqhr\") pod \"ceilometer-0\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " pod="openstack/ceilometer-0"
Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.162250 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.632143 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 12:56:20 crc kubenswrapper[4913]: I1001 12:56:20.819667 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a245020b-93c0-45eb-8ccc-ec2ed1c8392c" path="/var/lib/kubelet/pods/a245020b-93c0-45eb-8ccc-ec2ed1c8392c/volumes"
Oct 01 12:56:21 crc kubenswrapper[4913]: I1001 12:56:21.439648 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerStarted","Data":"9f0b789f1dc2c5817c532641d014592a943d378394c10bf3a48b9d3152d87165"}
Oct 01 12:56:22 crc kubenswrapper[4913]: I1001 12:56:22.451058 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerStarted","Data":"b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f"}
Oct 01 12:56:22 crc kubenswrapper[4913]: I1001 12:56:22.451824 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerStarted","Data":"0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd"}
Oct 01 12:56:23 crc kubenswrapper[4913]: I1001 12:56:23.463054 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerStarted","Data":"a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87"}
Oct 01 12:56:24 crc kubenswrapper[4913]: I1001 12:56:24.783262 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 01 12:56:25 crc kubenswrapper[4913]: I1001 12:56:25.487329 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerStarted","Data":"acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3"}
Oct 01 12:56:25 crc kubenswrapper[4913]: I1001 12:56:25.487657 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 12:56:25 crc kubenswrapper[4913]: I1001 12:56:25.522326 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.543626267 podStartE2EDuration="6.522302847s" podCreationTimestamp="2025-10-01 12:56:19 +0000 UTC" firstStartedPulling="2025-10-01 12:56:20.642084913 +0000 UTC m=+1112.545560491" lastFinishedPulling="2025-10-01 12:56:24.620761493 +0000 UTC m=+1116.524237071" observedRunningTime="2025-10-01 12:56:25.514707118 +0000 UTC m=+1117.418182716" watchObservedRunningTime="2025-10-01 12:56:25.522302847 +0000 UTC m=+1117.425778425"
Oct 01 12:56:26 crc kubenswrapper[4913]: I1001 12:56:26.721668 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 01 12:56:26 crc kubenswrapper[4913]: I1001 12:56:26.723454 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 01 12:56:26 crc kubenswrapper[4913]: I1001 12:56:26.727661 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 01 12:56:26 crc kubenswrapper[4913]: I1001 12:56:26.796047 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 01 12:56:27 crc kubenswrapper[4913]: I1001 12:56:27.512375 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 01 12:56:27 crc kubenswrapper[4913]: I1001 12:56:27.973568 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 01 12:56:27 crc kubenswrapper[4913]: I1001 12:56:27.973961 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 01 12:56:27 crc kubenswrapper[4913]: I1001 12:56:27.974616 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 01 12:56:27 crc kubenswrapper[4913]: I1001 12:56:27.978058 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.518196 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.521905 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.724193 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-657bf774d5-fwnrk"]
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.726057 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.738029 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-657bf774d5-fwnrk"]
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.833514 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-sb\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.833575 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gx4s\" (UniqueName: \"kubernetes.io/projected/96c0278e-106e-4985-a279-d8c89f141b15-kube-api-access-9gx4s\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.833609 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-nb\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.833993 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-config\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.834149 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-dns-svc\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.936406 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-dns-svc\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.936867 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-sb\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.936907 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gx4s\" (UniqueName: \"kubernetes.io/projected/96c0278e-106e-4985-a279-d8c89f141b15-kube-api-access-9gx4s\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.936947 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-nb\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.937032 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-config\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.937595 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-dns-svc\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.937958 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-config\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.937963 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-sb\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.938360 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-nb\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:28 crc kubenswrapper[4913]: I1001 12:56:28.964351 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gx4s\" (UniqueName: \"kubernetes.io/projected/96c0278e-106e-4985-a279-d8c89f141b15-kube-api-access-9gx4s\") pod \"dnsmasq-dns-657bf774d5-fwnrk\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") " pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:56:29 crc kubenswrapper[4913]: I1001 12:56:29.046372 4913 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" Oct 01 12:56:29 crc kubenswrapper[4913]: I1001 12:56:29.538348 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-657bf774d5-fwnrk"] Oct 01 12:56:29 crc kubenswrapper[4913]: W1001 12:56:29.548975 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96c0278e_106e_4985_a279_d8c89f141b15.slice/crio-7d33d0fa1caaf165d91fa4ba1a79325255eb507e6f8d310743393d0a8273b1da WatchSource:0}: Error finding container 7d33d0fa1caaf165d91fa4ba1a79325255eb507e6f8d310743393d0a8273b1da: Status 404 returned error can't find the container with id 7d33d0fa1caaf165d91fa4ba1a79325255eb507e6f8d310743393d0a8273b1da Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.536006 4913 generic.go:334] "Generic (PLEG): container finished" podID="96c0278e-106e-4985-a279-d8c89f141b15" containerID="894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c" exitCode=0 Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.536126 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" event={"ID":"96c0278e-106e-4985-a279-d8c89f141b15","Type":"ContainerDied","Data":"894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c"} Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.536424 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" event={"ID":"96c0278e-106e-4985-a279-d8c89f141b15","Type":"ContainerStarted","Data":"7d33d0fa1caaf165d91fa4ba1a79325255eb507e6f8d310743393d0a8273b1da"} Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.875368 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.875944 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-central-agent" containerID="cri-o://0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd" gracePeriod=30 Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.876076 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="proxy-httpd" containerID="cri-o://acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3" gracePeriod=30 Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.876116 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="sg-core" containerID="cri-o://a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87" gracePeriod=30 Oct 01 12:56:30 crc kubenswrapper[4913]: I1001 12:56:30.876143 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-notification-agent" containerID="cri-o://b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f" gracePeriod=30 Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.039118 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.545846 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" 
event={"ID":"96c0278e-106e-4985-a279-d8c89f141b15","Type":"ContainerStarted","Data":"198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98"} Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.546178 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549431 4913 generic.go:334] "Generic (PLEG): container finished" podID="8c737289-8db3-4914-b911-26256b36c4f4" containerID="acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3" exitCode=0 Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549459 4913 generic.go:334] "Generic (PLEG): container finished" podID="8c737289-8db3-4914-b911-26256b36c4f4" containerID="a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87" exitCode=2 Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549468 4913 generic.go:334] "Generic (PLEG): container finished" podID="8c737289-8db3-4914-b911-26256b36c4f4" containerID="0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd" exitCode=0 Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549469 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerDied","Data":"acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3"} Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549508 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerDied","Data":"a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87"} Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549521 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerDied","Data":"0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd"} Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549644 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-log" containerID="cri-o://6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9" gracePeriod=30 Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.549798 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-api" containerID="cri-o://94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f" gracePeriod=30 Oct 01 12:56:31 crc kubenswrapper[4913]: I1001 12:56:31.570936 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" podStartSLOduration=3.57092097 podStartE2EDuration="3.57092097s" podCreationTimestamp="2025-10-01 12:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:31.567949649 +0000 UTC m=+1123.471425247" watchObservedRunningTime="2025-10-01 12:56:31.57092097 +0000 UTC m=+1123.474396548" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.168301 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.296622 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-ceilometer-tls-certs\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.296922 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-scripts\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.296958 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-run-httpd\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.296976 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-log-httpd\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.297029 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-combined-ca-bundle\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.297085 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgqhr\" (UniqueName: \"kubernetes.io/projected/8c737289-8db3-4914-b911-26256b36c4f4-kube-api-access-jgqhr\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.297102 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-sg-core-conf-yaml\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.297146 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-config-data\") pod \"8c737289-8db3-4914-b911-26256b36c4f4\" (UID: \"8c737289-8db3-4914-b911-26256b36c4f4\") " Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.297523 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.298246 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.302874 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c737289-8db3-4914-b911-26256b36c4f4-kube-api-access-jgqhr" (OuterVolumeSpecName: "kube-api-access-jgqhr") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "kube-api-access-jgqhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.305035 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-scripts" (OuterVolumeSpecName: "scripts") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.357201 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.361480 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.398894 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgqhr\" (UniqueName: \"kubernetes.io/projected/8c737289-8db3-4914-b911-26256b36c4f4-kube-api-access-jgqhr\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.398929 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.398940 4913 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.398951 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.398961 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.398971 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c737289-8db3-4914-b911-26256b36c4f4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.405745 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.448735 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-config-data" (OuterVolumeSpecName: "config-data") pod "8c737289-8db3-4914-b911-26256b36c4f4" (UID: "8c737289-8db3-4914-b911-26256b36c4f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.500103 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.500135 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c737289-8db3-4914-b911-26256b36c4f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.558640 4913 generic.go:334] "Generic (PLEG): container finished" podID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerID="6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9" exitCode=143 Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.558696 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72bb9ee3-6f44-4a6e-a338-ae124d79498b","Type":"ContainerDied","Data":"6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9"} Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.562149 4913 generic.go:334] "Generic (PLEG): container finished" podID="8c737289-8db3-4914-b911-26256b36c4f4" containerID="b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f" exitCode=0 Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.562194 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.562251 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerDied","Data":"b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f"} Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.562295 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c737289-8db3-4914-b911-26256b36c4f4","Type":"ContainerDied","Data":"9f0b789f1dc2c5817c532641d014592a943d378394c10bf3a48b9d3152d87165"} Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.562316 4913 scope.go:117] "RemoveContainer" containerID="acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.593230 4913 scope.go:117] "RemoveContainer" containerID="a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.612153 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.621384 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.626465 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.626906 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="proxy-httpd" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.626930 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="proxy-httpd" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.626963 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="sg-core" Oct 01 
12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.626971 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="sg-core" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.626987 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-notification-agent" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.626995 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-notification-agent" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.627011 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-central-agent" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.627019 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-central-agent" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.627210 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-notification-agent" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.627236 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="sg-core" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.627254 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="ceilometer-central-agent" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.627282 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c737289-8db3-4914-b911-26256b36c4f4" containerName="proxy-httpd" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.655062 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.657632 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.658310 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.658362 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.664305 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.664477 4913 scope.go:117] "RemoveContainer" containerID="b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.693368 4913 scope.go:117] "RemoveContainer" containerID="0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.711682 4913 scope.go:117] "RemoveContainer" containerID="acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.712085 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3\": container with ID starting with acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3 not found: ID does not exist" containerID="acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.712180 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3"} err="failed to get container status \"acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3\": rpc error: code = NotFound desc = could not find container \"acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3\": container with ID starting with acb39ec7a76748ec144b6d51d945411256260212c8e37d98d50045c95d0d33f3 not found: ID does not exist" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.712300 4913 scope.go:117] "RemoveContainer" containerID="a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.712760 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87\": container with ID starting with a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87 not found: ID does not exist" containerID="a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.712848 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87"} err="failed to get container status \"a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87\": rpc error: code = NotFound desc = could not find container \"a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87\": container with ID starting with a514b303af2fe43b36ec4d506e9a53fdf18fa9800afd580bb632f469e45baa87 not found: ID does not exist" Oct 01 12:56:32 
crc kubenswrapper[4913]: I1001 12:56:32.712913 4913 scope.go:117] "RemoveContainer" containerID="b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.713188 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f\": container with ID starting with b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f not found: ID does not exist" containerID="b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.713219 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f"} err="failed to get container status \"b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f\": rpc error: code = NotFound desc = could not find container \"b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f\": container with ID starting with b5bdbae6beec3ee7900b38685e582c4ba5a2e08e86a1c3581efe675f941ba58f not found: ID does not exist" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.713241 4913 scope.go:117] "RemoveContainer" containerID="0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd" Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.713688 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd\": container with ID starting with 0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd not found: ID does not exist" containerID="0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.713775 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd"} err="failed to get container status \"0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd\": rpc error: code = NotFound desc = could not find container \"0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd\": container with ID starting with 0cd12cea1239427ebf2cc663aee1f9347b4dbe2c72bae103cb91099597e7c2dd not found: ID does not exist" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.804971 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-log-httpd\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.805610 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.805798 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-config-data\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " 
pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.805899 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-scripts\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.805974 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.806037 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7pf\" (UniqueName: \"kubernetes.io/projected/c6503749-0367-482c-9269-62360d2a6684-kube-api-access-nr7pf\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.806140 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-run-httpd\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.806213 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.818407 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c737289-8db3-4914-b911-26256b36c4f4" path="/var/lib/kubelet/pods/8c737289-8db3-4914-b911-26256b36c4f4/volumes" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.902386 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:32 crc kubenswrapper[4913]: E1001 12:56:32.902952 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-nr7pf log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="c6503749-0367-482c-9269-62360d2a6684" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.908763 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-run-httpd\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.908975 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909121 4913 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-log-httpd\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909212 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909376 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-config-data\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909515 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-scripts\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909597 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909676 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7pf\" (UniqueName: \"kubernetes.io/projected/c6503749-0367-482c-9269-62360d2a6684-kube-api-access-nr7pf\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909832 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-log-httpd\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.909419 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-run-httpd\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.914759 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.914777 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.915014 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.915354 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-scripts\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.916136 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-config-data\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:32 crc kubenswrapper[4913]: I1001 12:56:32.925763 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7pf\" (UniqueName: \"kubernetes.io/projected/c6503749-0367-482c-9269-62360d2a6684-kube-api-access-nr7pf\") pod \"ceilometer-0\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " pod="openstack/ceilometer-0" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.572491 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.589176 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723071 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-combined-ca-bundle\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723337 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-config-data\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723519 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-run-httpd\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723597 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-log-httpd\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723764 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-scripts\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723809 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-ceilometer-tls-certs\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723939 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr7pf\" (UniqueName: \"kubernetes.io/projected/c6503749-0367-482c-9269-62360d2a6684-kube-api-access-nr7pf\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.724047 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-sg-core-conf-yaml\") pod \"c6503749-0367-482c-9269-62360d2a6684\" (UID: \"c6503749-0367-482c-9269-62360d2a6684\") " Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723937 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.723985 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.725105 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.725138 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6503749-0367-482c-9269-62360d2a6684-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.727591 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-scripts" (OuterVolumeSpecName: "scripts") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.727991 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-config-data" (OuterVolumeSpecName: "config-data") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.729356 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.729571 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6503749-0367-482c-9269-62360d2a6684-kube-api-access-nr7pf" (OuterVolumeSpecName: "kube-api-access-nr7pf") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "kube-api-access-nr7pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.733410 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.734333 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c6503749-0367-482c-9269-62360d2a6684" (UID: "c6503749-0367-482c-9269-62360d2a6684"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.827086 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.827131 4913 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.827144 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr7pf\" (UniqueName: \"kubernetes.io/projected/c6503749-0367-482c-9269-62360d2a6684-kube-api-access-nr7pf\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.827156 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.827169 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:33 crc kubenswrapper[4913]: I1001 12:56:33.827180 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6503749-0367-482c-9269-62360d2a6684-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.585353 4913 generic.go:334] "Generic (PLEG): container finished" podID="122bc201-edae-47f2-a752-818ba02b0dea" containerID="3752308e4b5e6451986e149830f5ca336235e0ecbdd182c32a30d28192f1ef97" exitCode=137 Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.585476 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"122bc201-edae-47f2-a752-818ba02b0dea","Type":"ContainerDied","Data":"3752308e4b5e6451986e149830f5ca336235e0ecbdd182c32a30d28192f1ef97"} Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.585709 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.663761 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.673601 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.683833 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.686889 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.689677 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.690006 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.690160 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.699522 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.816848 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6503749-0367-482c-9269-62360d2a6684" path="/var/lib/kubelet/pods/c6503749-0367-482c-9269-62360d2a6684/volumes" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.845957 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-scripts\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.845998 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-log-httpd\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.846027 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9x8\" (UniqueName: \"kubernetes.io/projected/32bdf40f-f949-40cf-8416-3a032c521d59-kube-api-access-8j9x8\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.846083 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.846108 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-run-httpd\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.846133 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.846156 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.846198 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-config-data\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.898233 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.947470 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-scripts\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.947528 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-log-httpd\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.948147 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j9x8\" (UniqueName: \"kubernetes.io/projected/32bdf40f-f949-40cf-8416-3a032c521d59-kube-api-access-8j9x8\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.948331 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-log-httpd\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.948538 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.949026 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-run-httpd\") pod \"ceilometer-0\" (UID: 
\"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.949374 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.949426 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-run-httpd\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.949412 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.949873 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-config-data\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.954179 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.954191 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-scripts\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.955512 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.965535 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j9x8\" (UniqueName: \"kubernetes.io/projected/32bdf40f-f949-40cf-8416-3a032c521d59-kube-api-access-8j9x8\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.965769 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-config-data\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " pod="openstack/ceilometer-0" Oct 01 12:56:34 crc kubenswrapper[4913]: I1001 12:56:34.977690 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") " 
pod="openstack/ceilometer-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.020291 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.053054 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nzf2\" (UniqueName: \"kubernetes.io/projected/122bc201-edae-47f2-a752-818ba02b0dea-kube-api-access-6nzf2\") pod \"122bc201-edae-47f2-a752-818ba02b0dea\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.053182 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-combined-ca-bundle\") pod \"122bc201-edae-47f2-a752-818ba02b0dea\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.053245 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-config-data\") pod \"122bc201-edae-47f2-a752-818ba02b0dea\" (UID: \"122bc201-edae-47f2-a752-818ba02b0dea\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.057455 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122bc201-edae-47f2-a752-818ba02b0dea-kube-api-access-6nzf2" (OuterVolumeSpecName: "kube-api-access-6nzf2") pod "122bc201-edae-47f2-a752-818ba02b0dea" (UID: "122bc201-edae-47f2-a752-818ba02b0dea"). InnerVolumeSpecName "kube-api-access-6nzf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.084524 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "122bc201-edae-47f2-a752-818ba02b0dea" (UID: "122bc201-edae-47f2-a752-818ba02b0dea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.092664 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-config-data" (OuterVolumeSpecName: "config-data") pod "122bc201-edae-47f2-a752-818ba02b0dea" (UID: "122bc201-edae-47f2-a752-818ba02b0dea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.154369 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw75h\" (UniqueName: \"kubernetes.io/projected/72bb9ee3-6f44-4a6e-a338-ae124d79498b-kube-api-access-gw75h\") pod \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.154471 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-combined-ca-bundle\") pod \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.154588 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-config-data\") pod \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.154668 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bb9ee3-6f44-4a6e-a338-ae124d79498b-logs\") pod \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\" (UID: \"72bb9ee3-6f44-4a6e-a338-ae124d79498b\") " Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.155032 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.155050 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122bc201-edae-47f2-a752-818ba02b0dea-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.155059 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nzf2\" (UniqueName: \"kubernetes.io/projected/122bc201-edae-47f2-a752-818ba02b0dea-kube-api-access-6nzf2\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.155358 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72bb9ee3-6f44-4a6e-a338-ae124d79498b-logs" (OuterVolumeSpecName: "logs") pod "72bb9ee3-6f44-4a6e-a338-ae124d79498b" (UID: "72bb9ee3-6f44-4a6e-a338-ae124d79498b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.157452 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bb9ee3-6f44-4a6e-a338-ae124d79498b-kube-api-access-gw75h" (OuterVolumeSpecName: "kube-api-access-gw75h") pod "72bb9ee3-6f44-4a6e-a338-ae124d79498b" (UID: "72bb9ee3-6f44-4a6e-a338-ae124d79498b"). InnerVolumeSpecName "kube-api-access-gw75h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.181240 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-config-data" (OuterVolumeSpecName: "config-data") pod "72bb9ee3-6f44-4a6e-a338-ae124d79498b" (UID: "72bb9ee3-6f44-4a6e-a338-ae124d79498b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.187753 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72bb9ee3-6f44-4a6e-a338-ae124d79498b" (UID: "72bb9ee3-6f44-4a6e-a338-ae124d79498b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.214074 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.256305 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.256607 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72bb9ee3-6f44-4a6e-a338-ae124d79498b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.256618 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72bb9ee3-6f44-4a6e-a338-ae124d79498b-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.256627 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw75h\" (UniqueName: \"kubernetes.io/projected/72bb9ee3-6f44-4a6e-a338-ae124d79498b-kube-api-access-gw75h\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.597566 4913 generic.go:334] "Generic (PLEG): container finished" podID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerID="94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f" exitCode=0 Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.597636 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.597644 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72bb9ee3-6f44-4a6e-a338-ae124d79498b","Type":"ContainerDied","Data":"94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f"} Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.597672 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72bb9ee3-6f44-4a6e-a338-ae124d79498b","Type":"ContainerDied","Data":"9db6c342e2b58e504d1a1ed749ca29f13a141302f95f1e2d968845c690b29e17"} Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.597689 4913 scope.go:117] "RemoveContainer" containerID="94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.599450 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"122bc201-edae-47f2-a752-818ba02b0dea","Type":"ContainerDied","Data":"066d22631dfb3b92118491ccb38cf880f29f27f41412deb3734be63779a22fac"} Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.599551 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.635390 4913 scope.go:117] "RemoveContainer" containerID="6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.635968 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.645206 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.668062 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.686723 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.698417 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: E1001 12:56:35.698880 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-api" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.698904 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-api" Oct 01 12:56:35 crc kubenswrapper[4913]: E1001 12:56:35.698960 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122bc201-edae-47f2-a752-818ba02b0dea" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.698969 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="122bc201-edae-47f2-a752-818ba02b0dea" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 12:56:35 crc kubenswrapper[4913]: E1001 12:56:35.698982 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-log" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.698992 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-log" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.699209 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="122bc201-edae-47f2-a752-818ba02b0dea" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.699231 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-api" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.699243 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" containerName="nova-api-log" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.700022 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.703013 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.703342 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.704005 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.708909 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.711904 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.711961 4913 scope.go:117] "RemoveContainer" containerID="94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f" Oct 01 12:56:35 crc kubenswrapper[4913]: E1001 12:56:35.716128 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f\": container with ID starting with 94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f not found: ID does not exist" containerID="94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.716165 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f"} err="failed to get container status \"94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f\": rpc error: code = NotFound desc = could not find container \"94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f\": container with ID starting with 94908159d949400adf228c15e33961bba9c3112ebecbb48a478ccfc48cab234f not found: ID does not exist" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.716206 4913 scope.go:117] "RemoveContainer" containerID="6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.722675 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: E1001 12:56:35.723788 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9\": container with ID starting with 6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9 not found: ID does not exist" containerID="6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.723837 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9"} err="failed to get container status \"6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9\": rpc error: code = NotFound desc = could not find container \"6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9\": container with ID starting with 6c61615400e81dab8e72eb7ca03ff7406cb30d4ceaafeeafe6fc0f93ab617ab9 not found: ID does not exist" Oct 01 12:56:35 crc 
kubenswrapper[4913]: I1001 12:56:35.723863 4913 scope.go:117] "RemoveContainer" containerID="3752308e4b5e6451986e149830f5ca336235e0ecbdd182c32a30d28192f1ef97" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.724304 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.728969 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.729471 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.729868 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.729986 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865241 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndn7d\" (UniqueName: \"kubernetes.io/projected/68897558-9e02-481e-bf66-50dcdfcd8d42-kube-api-access-ndn7d\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865337 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865379 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-config-data\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865402 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865466 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-public-tls-certs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865558 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865597 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68897558-9e02-481e-bf66-50dcdfcd8d42-logs\") pod \"nova-api-0\" (UID: 
\"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865711 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865803 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzvld\" (UniqueName: \"kubernetes.io/projected/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-kube-api-access-bzvld\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865854 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.865885 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968256 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968368 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68897558-9e02-481e-bf66-50dcdfcd8d42-logs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968490 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968526 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzvld\" (UniqueName: \"kubernetes.io/projected/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-kube-api-access-bzvld\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968584 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 
12:56:35.968616 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968745 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndn7d\" (UniqueName: \"kubernetes.io/projected/68897558-9e02-481e-bf66-50dcdfcd8d42-kube-api-access-ndn7d\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968789 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968821 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-config-data\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968854 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.968927 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-public-tls-certs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.970005 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68897558-9e02-481e-bf66-50dcdfcd8d42-logs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.975968 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.976495 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.976914 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.980213 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-public-tls-certs\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.980655 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.980884 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.981846 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.981997 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-config-data\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.987982 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndn7d\" (UniqueName: \"kubernetes.io/projected/68897558-9e02-481e-bf66-50dcdfcd8d42-kube-api-access-ndn7d\") pod \"nova-api-0\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " pod="openstack/nova-api-0" Oct 01 12:56:35 crc kubenswrapper[4913]: I1001 12:56:35.991464 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzvld\" (UniqueName: \"kubernetes.io/projected/2c8c0d2b-3313-4919-9477-42d93dd1dfdc-kube-api-access-bzvld\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c8c0d2b-3313-4919-9477-42d93dd1dfdc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.048576 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.055844 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.516370 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:36 crc kubenswrapper[4913]: W1001 12:56:36.534200 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68897558_9e02_481e_bf66_50dcdfcd8d42.slice/crio-e0aa16b8d31363a4e570c98a8be979bca89cd0d32ddbae0b9c70a1c766758651 WatchSource:0}: Error finding container e0aa16b8d31363a4e570c98a8be979bca89cd0d32ddbae0b9c70a1c766758651: Status 404 returned error can't find the container with id e0aa16b8d31363a4e570c98a8be979bca89cd0d32ddbae0b9c70a1c766758651 Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.595675 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:56:36 crc kubenswrapper[4913]: W1001 12:56:36.602058 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c8c0d2b_3313_4919_9477_42d93dd1dfdc.slice/crio-b9948784029eb0e85f05c2e63b53427b22f32f238337fa43bb0db5a81773b983 WatchSource:0}: Error finding container b9948784029eb0e85f05c2e63b53427b22f32f238337fa43bb0db5a81773b983: Status 404 returned error can't find the container with id b9948784029eb0e85f05c2e63b53427b22f32f238337fa43bb0db5a81773b983 Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.611454 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68897558-9e02-481e-bf66-50dcdfcd8d42","Type":"ContainerStarted","Data":"e0aa16b8d31363a4e570c98a8be979bca89cd0d32ddbae0b9c70a1c766758651"} Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.613332 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerStarted","Data":"1d25ae0bccec6ac70640a85460308cfb578e713ec810f1442b3d4ddf5bfb4376"} Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.613450 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerStarted","Data":"739279761b1046f87b319836bfdbc90584e977ad485292e668f73b680eac4fb1"} Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.824009 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122bc201-edae-47f2-a752-818ba02b0dea" path="/var/lib/kubelet/pods/122bc201-edae-47f2-a752-818ba02b0dea/volumes" Oct 01 12:56:36 crc kubenswrapper[4913]: I1001 12:56:36.825706 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72bb9ee3-6f44-4a6e-a338-ae124d79498b" path="/var/lib/kubelet/pods/72bb9ee3-6f44-4a6e-a338-ae124d79498b/volumes" Oct 01 12:56:37 crc kubenswrapper[4913]: I1001 12:56:37.626519 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68897558-9e02-481e-bf66-50dcdfcd8d42","Type":"ContainerStarted","Data":"6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c"} Oct 01 12:56:37 crc kubenswrapper[4913]: I1001 12:56:37.626957 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68897558-9e02-481e-bf66-50dcdfcd8d42","Type":"ContainerStarted","Data":"9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0"} Oct 01 12:56:37 crc kubenswrapper[4913]: I1001 12:56:37.627646 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c8c0d2b-3313-4919-9477-42d93dd1dfdc","Type":"ContainerStarted","Data":"8ad7a5a502d1add4db4599b3add72a2fbdab8704a73501c25b5b7127088e2cc4"} Oct 01 12:56:37 crc kubenswrapper[4913]: I1001 12:56:37.627664 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c8c0d2b-3313-4919-9477-42d93dd1dfdc","Type":"ContainerStarted","Data":"b9948784029eb0e85f05c2e63b53427b22f32f238337fa43bb0db5a81773b983"} Oct 01 12:56:37 crc kubenswrapper[4913]: I1001 12:56:37.629086 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerStarted","Data":"ecd6b515d802897b63c90bd65f0970c65293b8d85345fa6c742603b43a56f55b"} Oct 01 12:56:37 crc kubenswrapper[4913]: I1001 12:56:37.655461 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.655439799 podStartE2EDuration="2.655439799s" podCreationTimestamp="2025-10-01 12:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:37.650837133 +0000 UTC m=+1129.554312741" watchObservedRunningTime="2025-10-01 12:56:37.655439799 +0000 UTC m=+1129.558915377" Oct 01 12:56:38 crc kubenswrapper[4913]: I1001 12:56:38.640741 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerStarted","Data":"e26ec81d006b97f0751099a25aeb00dacbd36dda58452ba98b53767e3c31f188"} Oct 01 12:56:38 crc kubenswrapper[4913]: I1001 12:56:38.851711 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.851688121 podStartE2EDuration="3.851688121s" podCreationTimestamp="2025-10-01 12:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:37.678439051 +0000 UTC m=+1129.581914649" watchObservedRunningTime="2025-10-01 12:56:38.851688121 +0000 UTC m=+1130.755163719" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.049458 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.140047 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b454497-drt5d"] Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.140406 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-745b454497-drt5d" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerName="dnsmasq-dns" containerID="cri-o://8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407" gracePeriod=10 Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.645709 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.651580 4913 generic.go:334] "Generic (PLEG): container finished" podID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerID="8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407" exitCode=0 Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.651613 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b454497-drt5d" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.651631 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b454497-drt5d" event={"ID":"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973","Type":"ContainerDied","Data":"8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407"} Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.652024 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b454497-drt5d" event={"ID":"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973","Type":"ContainerDied","Data":"3503c16e9f3a159158e5215cb3822f052519a4c68123be5ea5a967d37d229ecf"} Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.652057 4913 scope.go:117] "RemoveContainer" containerID="8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.669998 4913 scope.go:117] "RemoveContainer" containerID="9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.704633 4913 scope.go:117] "RemoveContainer" containerID="8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407" Oct 01 12:56:39 crc kubenswrapper[4913]: E1001 12:56:39.705385 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407\": container with ID starting with 8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407 not found: ID does not exist" containerID="8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.705613 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407"} err="failed to get container status \"8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407\": rpc error: code = NotFound desc = could not find container \"8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407\": container with ID starting with 8be06082a90483ac727a581c388cd02ac073187cbf0e5094acda07d40b509407 not found: ID does not exist" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.705718 4913 scope.go:117] "RemoveContainer" containerID="9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512" Oct 01 12:56:39 crc kubenswrapper[4913]: E1001 12:56:39.706235 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512\": container with ID starting with 9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512 not found: ID does not exist" containerID="9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.706348 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512"} err="failed to get container status \"9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512\": rpc error: code = NotFound desc = could not find container \"9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512\": container with ID starting with 9e5175b326378e964c8f4d2bad6ffbc15b77400c42f9d9241aa71f70c34bd512 not found: ID does not exist" Oct 01 12:56:39 crc 
kubenswrapper[4913]: I1001 12:56:39.750104 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-sb\") pod \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.750180 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-dns-svc\") pod \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.750211 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-config\") pod \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.750326 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-nb\") pod \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.750487 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfm2l\" (UniqueName: \"kubernetes.io/projected/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-kube-api-access-gfm2l\") pod \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\" (UID: \"9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973\") " Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.757751 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-kube-api-access-gfm2l" (OuterVolumeSpecName: "kube-api-access-gfm2l") pod "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" (UID: "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973"). InnerVolumeSpecName "kube-api-access-gfm2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.807814 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-config" (OuterVolumeSpecName: "config") pod "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" (UID: "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.813776 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" (UID: "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.814608 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" (UID: "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.823560 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" (UID: "9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.853300 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.853325 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfm2l\" (UniqueName: \"kubernetes.io/projected/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-kube-api-access-gfm2l\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.853334 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.853362 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.853370 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:39 crc kubenswrapper[4913]: I1001 12:56:39.993214 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b454497-drt5d"] Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.004256 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b454497-drt5d"] Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.083562 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.083618 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.083665 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.084471 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8ccdeaa9feae2c057a74d4a7cb5ae3ea008156e58af6b9bb65a4673a0aaa9d4"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:56:40 crc 
kubenswrapper[4913]: I1001 12:56:40.084538 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://b8ccdeaa9feae2c057a74d4a7cb5ae3ea008156e58af6b9bb65a4673a0aaa9d4" gracePeriod=600 Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.660510 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerStarted","Data":"8ba8adfe345f66348211f5072e29facd580bdc6a6c18466c7d0bf1ada230a67e"} Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.660904 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.664788 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="b8ccdeaa9feae2c057a74d4a7cb5ae3ea008156e58af6b9bb65a4673a0aaa9d4" exitCode=0 Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.664970 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"b8ccdeaa9feae2c057a74d4a7cb5ae3ea008156e58af6b9bb65a4673a0aaa9d4"} Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.665024 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"fd610d0107ae658b14a61d2241d34b3304551116724eb973af1cbc4a77d29ef1"} Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.665059 4913 scope.go:117] "RemoveContainer" containerID="799ca569c504b87f0203003c8051d299d1a44d32ea3031c0c1940d1be3fbaa96" Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.688691 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.537110304 podStartE2EDuration="6.68867464s" podCreationTimestamp="2025-10-01 12:56:34 +0000 UTC" firstStartedPulling="2025-10-01 12:56:35.716522352 +0000 UTC m=+1127.619997930" lastFinishedPulling="2025-10-01 12:56:39.868086688 +0000 UTC m=+1131.771562266" observedRunningTime="2025-10-01 12:56:40.688424894 +0000 UTC m=+1132.591900492" watchObservedRunningTime="2025-10-01 12:56:40.68867464 +0000 UTC m=+1132.592150218" Oct 01 12:56:40 crc kubenswrapper[4913]: I1001 12:56:40.825437 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" path="/var/lib/kubelet/pods/9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973/volumes" Oct 01 12:56:41 crc kubenswrapper[4913]: I1001 12:56:41.049532 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.049126 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.056256 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.056305 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.069680 4913 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.743572 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.934297 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z4hjt"] Oct 01 12:56:46 crc kubenswrapper[4913]: E1001 12:56:46.934728 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerName="init" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.934747 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerName="init" Oct 01 12:56:46 crc kubenswrapper[4913]: E1001 12:56:46.934771 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerName="dnsmasq-dns" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.934780 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerName="dnsmasq-dns" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.934988 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e51d4e0-c0a4-4a49-8d6c-7f063a3bc973" containerName="dnsmasq-dns" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.935717 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.937892 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.938607 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 12:56:46 crc kubenswrapper[4913]: I1001 12:56:46.946383 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4hjt"] Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.066525 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.066593 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.098222 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-config-data\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.098339 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8f9\" (UniqueName: \"kubernetes.io/projected/56638b67-fd8f-40b1-85da-09edf48fbd46-kube-api-access-7j8f9\") pod 
\"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.098392 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-scripts\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.098465 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.199897 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.200017 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-config-data\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.200045 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8f9\" (UniqueName: \"kubernetes.io/projected/56638b67-fd8f-40b1-85da-09edf48fbd46-kube-api-access-7j8f9\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.200080 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-scripts\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.205884 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-config-data\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.206677 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-scripts\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.207327 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: 
\"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.223657 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8f9\" (UniqueName: \"kubernetes.io/projected/56638b67-fd8f-40b1-85da-09edf48fbd46-kube-api-access-7j8f9\") pod \"nova-cell1-cell-mapping-z4hjt\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.257845 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.729058 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4hjt"] Oct 01 12:56:47 crc kubenswrapper[4913]: I1001 12:56:47.756709 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4hjt" event={"ID":"56638b67-fd8f-40b1-85da-09edf48fbd46","Type":"ContainerStarted","Data":"dc00910b02702e98ffe2ac0c08c2cd51fbb151041285f755efdcd0e881a7438f"} Oct 01 12:56:48 crc kubenswrapper[4913]: I1001 12:56:48.770046 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4hjt" event={"ID":"56638b67-fd8f-40b1-85da-09edf48fbd46","Type":"ContainerStarted","Data":"af0568664959c58c8e024eaa00731438bdd8392c44c4691fcf4374d7f1a594cf"} Oct 01 12:56:48 crc kubenswrapper[4913]: I1001 12:56:48.796488 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z4hjt" podStartSLOduration=2.796471843 podStartE2EDuration="2.796471843s" podCreationTimestamp="2025-10-01 12:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:48.793576314 +0000 UTC m=+1140.697051912" watchObservedRunningTime="2025-10-01 12:56:48.796471843 +0000 UTC m=+1140.699947421" Oct 01 12:56:52 crc kubenswrapper[4913]: I1001 12:56:52.808986 4913 generic.go:334] "Generic (PLEG): container finished" podID="56638b67-fd8f-40b1-85da-09edf48fbd46" containerID="af0568664959c58c8e024eaa00731438bdd8392c44c4691fcf4374d7f1a594cf" exitCode=0 Oct 01 12:56:52 crc kubenswrapper[4913]: I1001 12:56:52.815313 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4hjt" event={"ID":"56638b67-fd8f-40b1-85da-09edf48fbd46","Type":"ContainerDied","Data":"af0568664959c58c8e024eaa00731438bdd8392c44c4691fcf4374d7f1a594cf"} Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.174942 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.254818 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8f9\" (UniqueName: \"kubernetes.io/projected/56638b67-fd8f-40b1-85da-09edf48fbd46-kube-api-access-7j8f9\") pod \"56638b67-fd8f-40b1-85da-09edf48fbd46\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.254877 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-scripts\") pod \"56638b67-fd8f-40b1-85da-09edf48fbd46\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.254921 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-combined-ca-bundle\") pod \"56638b67-fd8f-40b1-85da-09edf48fbd46\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.255015 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-config-data\") pod \"56638b67-fd8f-40b1-85da-09edf48fbd46\" (UID: \"56638b67-fd8f-40b1-85da-09edf48fbd46\") " Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.260944 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-scripts" (OuterVolumeSpecName: "scripts") pod "56638b67-fd8f-40b1-85da-09edf48fbd46" (UID: "56638b67-fd8f-40b1-85da-09edf48fbd46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.261109 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56638b67-fd8f-40b1-85da-09edf48fbd46-kube-api-access-7j8f9" (OuterVolumeSpecName: "kube-api-access-7j8f9") pod "56638b67-fd8f-40b1-85da-09edf48fbd46" (UID: "56638b67-fd8f-40b1-85da-09edf48fbd46"). InnerVolumeSpecName "kube-api-access-7j8f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.283687 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56638b67-fd8f-40b1-85da-09edf48fbd46" (UID: "56638b67-fd8f-40b1-85da-09edf48fbd46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
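
Note: the records above capture the full life of the nova-cell1-cell-mapping-z4hjt job pod: its container exits with code 0 and the kubelet immediately unmounts the job's secret and projected volumes. In its startup-duration record, firstStartedPulling and lastFinishedPulling are Go's zero time ("0001-01-01 00:00:00 +0000 UTC"), meaning no image pull was recorded, which is why podStartSLOduration equals podStartE2EDuration (both 2.796471843s). A minimal sketch (assuming kubeconfig access to the same cluster) for watching such a run-to-completion pod and reporting its terminal exit code:

    from kubernetes import client, config, watch

    # Follow a run-to-completion pod, such as the cell-mapping job above,
    # and report each container's exit code once it terminates.
    config.load_kube_config()
    v1 = client.CoreV1Api()
    w = watch.Watch()
    for event in w.stream(v1.list_namespaced_pod, namespace="openstack",
                          field_selector="metadata.name=nova-cell1-cell-mapping-z4hjt"):
        for s in (event["object"].status.container_statuses or []):
            if s.state.terminated is not None:
                print(s.name, "exited with code", s.state.terminated.exit_code)
                w.stop()

Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.298158 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-config-data" (OuterVolumeSpecName: "config-data") pod "56638b67-fd8f-40b1-85da-09edf48fbd46" (UID: "56638b67-fd8f-40b1-85da-09edf48fbd46"). InnerVolumeSpecName "config-data".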
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.357564 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.357601 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j8f9\" (UniqueName: \"kubernetes.io/projected/56638b67-fd8f-40b1-85da-09edf48fbd46-kube-api-access-7j8f9\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.357612 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.357620 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56638b67-fd8f-40b1-85da-09edf48fbd46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.825445 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4hjt" event={"ID":"56638b67-fd8f-40b1-85da-09edf48fbd46","Type":"ContainerDied","Data":"dc00910b02702e98ffe2ac0c08c2cd51fbb151041285f755efdcd0e881a7438f"} Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.825680 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc00910b02702e98ffe2ac0c08c2cd51fbb151041285f755efdcd0e881a7438f" Oct 01 12:56:54 crc kubenswrapper[4913]: I1001 12:56:54.825513 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4hjt" Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.019483 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.019776 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="741b4306-efcc-4357-af8d-d74daab515ea" containerName="nova-scheduler-scheduler" containerID="cri-o://c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4" gracePeriod=30 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.031445 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.031693 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-log" containerID="cri-o://9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0" gracePeriod=30 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.031833 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-api" containerID="cri-o://6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c" gracePeriod=30 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.044732 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
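
Note: the burst of "SyncLoop DELETE" records above shows API-initiated deletions of nova-scheduler-0, nova-api-0, and nova-metadata-0 fanning out into one "Killing container with a grace period" record per container, each with gracePeriod=30 (the pod-level grace period). The deletes here were issued by the OpenStack control plane, not by hand; an equivalent call via the Python kubernetes client would look like this sketch:

    from kubernetes import client, config

    # Delete a pod with an explicit 30s grace period, matching gracePeriod=30
    # in the kill records above. The kubelet signals every container in the
    # pod to stop and escalates to a hard kill only after the period expires.
    config.load_kube_config()
    client.CoreV1Api().delete_namespaced_pod(
        name="nova-api-0",
        namespace="openstack",
        grace_period_seconds=30,
    )

Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.045286 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5"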
containerName="nova-metadata-log" containerID="cri-o://ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e" gracePeriod=30 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.045391 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-metadata" containerID="cri-o://972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179" gracePeriod=30 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.836051 4913 generic.go:334] "Generic (PLEG): container finished" podID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerID="9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0" exitCode=143 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.836159 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68897558-9e02-481e-bf66-50dcdfcd8d42","Type":"ContainerDied","Data":"9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0"} Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.839135 4913 generic.go:334] "Generic (PLEG): container finished" podID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerID="ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e" exitCode=143 Oct 01 12:56:55 crc kubenswrapper[4913]: I1001 12:56:55.839183 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"57b23d8d-d51d-474a-b84d-644e88b745d5","Type":"ContainerDied","Data":"ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e"} Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.549718 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.620157 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-config-data\") pod \"741b4306-efcc-4357-af8d-d74daab515ea\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.620213 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-combined-ca-bundle\") pod \"741b4306-efcc-4357-af8d-d74daab515ea\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.620308 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4np4\" (UniqueName: \"kubernetes.io/projected/741b4306-efcc-4357-af8d-d74daab515ea-kube-api-access-q4np4\") pod \"741b4306-efcc-4357-af8d-d74daab515ea\" (UID: \"741b4306-efcc-4357-af8d-d74daab515ea\") " Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.632545 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741b4306-efcc-4357-af8d-d74daab515ea-kube-api-access-q4np4" (OuterVolumeSpecName: "kube-api-access-q4np4") pod "741b4306-efcc-4357-af8d-d74daab515ea" (UID: "741b4306-efcc-4357-af8d-d74daab515ea"). InnerVolumeSpecName "kube-api-access-q4np4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.655734 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-config-data" (OuterVolumeSpecName: "config-data") pod "741b4306-efcc-4357-af8d-d74daab515ea" (UID: "741b4306-efcc-4357-af8d-d74daab515ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.656199 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "741b4306-efcc-4357-af8d-d74daab515ea" (UID: "741b4306-efcc-4357-af8d-d74daab515ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.722145 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.722172 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741b4306-efcc-4357-af8d-d74daab515ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.722183 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4np4\" (UniqueName: \"kubernetes.io/projected/741b4306-efcc-4357-af8d-d74daab515ea-kube-api-access-q4np4\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.858134 4913 generic.go:334] "Generic (PLEG): container finished" podID="741b4306-efcc-4357-af8d-d74daab515ea" containerID="c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4" exitCode=0 Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.858195 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.858225 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"741b4306-efcc-4357-af8d-d74daab515ea","Type":"ContainerDied","Data":"c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4"} Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.858317 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"741b4306-efcc-4357-af8d-d74daab515ea","Type":"ContainerDied","Data":"66e9d902b5cde517349af0b3ec824513125f5f6b256bb1208c92e449f734b584"} Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.858352 4913 scope.go:117] "RemoveContainer" containerID="c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.888518 4913 scope.go:117] "RemoveContainer" containerID="c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4" Oct 01 12:56:57 crc kubenswrapper[4913]: E1001 12:56:57.889023 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4\": container with ID starting with c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4 not found: ID does not exist" containerID="c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.889053 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4"} err="failed to get container status \"c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4\": rpc error: code = NotFound desc = could not find container \"c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4\": container with ID starting with c1d643f65e3d824697352d51f6ff065839ab2397d310f2c8d7768785957400d4 not found: ID does not exist" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.907029 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.929205 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.954628 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:57 crc kubenswrapper[4913]: E1001 12:56:57.955083 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56638b67-fd8f-40b1-85da-09edf48fbd46" containerName="nova-manage" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.955118 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="56638b67-fd8f-40b1-85da-09edf48fbd46" containerName="nova-manage" Oct 01 12:56:57 crc kubenswrapper[4913]: E1001 12:56:57.955145 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741b4306-efcc-4357-af8d-d74daab515ea" containerName="nova-scheduler-scheduler" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.955152 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="741b4306-efcc-4357-af8d-d74daab515ea" containerName="nova-scheduler-scheduler" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.955395 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="56638b67-fd8f-40b1-85da-09edf48fbd46" containerName="nova-manage"
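
Note: the NotFound pair above ("ContainerStatus from runtime service failed" followed by "DeleteContainer returned error") is a benign race rather than a failure: the kubelet asks CRI-O to remove a container that an earlier cleanup pass already removed, and a NotFound answer means the work is already done. The same idempotent-cleanup pattern in client code, as a hypothetical helper:

    from kubernetes import client
    from kubernetes.client.rest import ApiException

    def delete_pod_if_present(v1: client.CoreV1Api, name: str, namespace: str) -> None:
        """Delete a pod, treating 404 NotFound as success (already gone)."""
        try:
            v1.delete_namespaced_pod(name=name, namespace=namespace)
        except ApiException as exc:
            if exc.status != 404:  # anything other than NotFound is a real error
                raise

Oct 01 12:56:57 crc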
kubenswrapper[4913]: I1001 12:56:57.955424 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="741b4306-efcc-4357-af8d-d74daab515ea" containerName="nova-scheduler-scheduler" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.956130 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.957971 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 12:56:57 crc kubenswrapper[4913]: I1001 12:56:57.962809 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.027433 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvgf\" (UniqueName: \"kubernetes.io/projected/fb00793e-8b71-47c4-8bce-1197d68a8b4b-kube-api-access-5xvgf\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.027502 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00793e-8b71-47c4-8bce-1197d68a8b4b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.027579 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00793e-8b71-47c4-8bce-1197d68a8b4b-config-data\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.129527 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvgf\" (UniqueName: \"kubernetes.io/projected/fb00793e-8b71-47c4-8bce-1197d68a8b4b-kube-api-access-5xvgf\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.129580 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00793e-8b71-47c4-8bce-1197d68a8b4b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.129622 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00793e-8b71-47c4-8bce-1197d68a8b4b-config-data\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.133922 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00793e-8b71-47c4-8bce-1197d68a8b4b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.135447 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fb00793e-8b71-47c4-8bce-1197d68a8b4b-config-data\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.145674 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvgf\" (UniqueName: \"kubernetes.io/projected/fb00793e-8b71-47c4-8bce-1197d68a8b4b-kube-api-access-5xvgf\") pod \"nova-scheduler-0\" (UID: \"fb00793e-8b71-47c4-8bce-1197d68a8b4b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.195725 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": read tcp 10.217.0.2:35488->10.217.0.180:8775: read: connection reset by peer" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.195742 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": read tcp 10.217.0.2:35496->10.217.0.180:8775: read: connection reset by peer" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.283177 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.714130 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.721805 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.822080 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741b4306-efcc-4357-af8d-d74daab515ea" path="/var/lib/kubelet/pods/741b4306-efcc-4357-af8d-d74daab515ea/volumes" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.828023 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843228 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57b23d8d-d51d-474a-b84d-644e88b745d5-logs\") pod \"57b23d8d-d51d-474a-b84d-644e88b745d5\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843365 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-public-tls-certs\") pod \"68897558-9e02-481e-bf66-50dcdfcd8d42\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843429 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-internal-tls-certs\") pod \"68897558-9e02-481e-bf66-50dcdfcd8d42\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843457 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-config-data\") pod 
\"68897558-9e02-481e-bf66-50dcdfcd8d42\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843516 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndn7d\" (UniqueName: \"kubernetes.io/projected/68897558-9e02-481e-bf66-50dcdfcd8d42-kube-api-access-ndn7d\") pod \"68897558-9e02-481e-bf66-50dcdfcd8d42\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843573 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-combined-ca-bundle\") pod \"68897558-9e02-481e-bf66-50dcdfcd8d42\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843592 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-nova-metadata-tls-certs\") pod \"57b23d8d-d51d-474a-b84d-644e88b745d5\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843620 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-str8w\" (UniqueName: \"kubernetes.io/projected/57b23d8d-d51d-474a-b84d-644e88b745d5-kube-api-access-str8w\") pod \"57b23d8d-d51d-474a-b84d-644e88b745d5\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843651 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-config-data\") pod \"57b23d8d-d51d-474a-b84d-644e88b745d5\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843670 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68897558-9e02-481e-bf66-50dcdfcd8d42-logs\") pod \"68897558-9e02-481e-bf66-50dcdfcd8d42\" (UID: \"68897558-9e02-481e-bf66-50dcdfcd8d42\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843691 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-combined-ca-bundle\") pod \"57b23d8d-d51d-474a-b84d-644e88b745d5\" (UID: \"57b23d8d-d51d-474a-b84d-644e88b745d5\") " Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.843989 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b23d8d-d51d-474a-b84d-644e88b745d5-logs" (OuterVolumeSpecName: "logs") pod "57b23d8d-d51d-474a-b84d-644e88b745d5" (UID: "57b23d8d-d51d-474a-b84d-644e88b745d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.844461 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57b23d8d-d51d-474a-b84d-644e88b745d5-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.844484 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68897558-9e02-481e-bf66-50dcdfcd8d42-logs" (OuterVolumeSpecName: "logs") pod "68897558-9e02-481e-bf66-50dcdfcd8d42" (UID: "68897558-9e02-481e-bf66-50dcdfcd8d42"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.847321 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68897558-9e02-481e-bf66-50dcdfcd8d42-kube-api-access-ndn7d" (OuterVolumeSpecName: "kube-api-access-ndn7d") pod "68897558-9e02-481e-bf66-50dcdfcd8d42" (UID: "68897558-9e02-481e-bf66-50dcdfcd8d42"). InnerVolumeSpecName "kube-api-access-ndn7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.848146 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b23d8d-d51d-474a-b84d-644e88b745d5-kube-api-access-str8w" (OuterVolumeSpecName: "kube-api-access-str8w") pod "57b23d8d-d51d-474a-b84d-644e88b745d5" (UID: "57b23d8d-d51d-474a-b84d-644e88b745d5"). InnerVolumeSpecName "kube-api-access-str8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.868640 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68897558-9e02-481e-bf66-50dcdfcd8d42" (UID: "68897558-9e02-481e-bf66-50dcdfcd8d42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.869347 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb00793e-8b71-47c4-8bce-1197d68a8b4b","Type":"ContainerStarted","Data":"67b6b4699c9d0f4d9aeb4336980f3f1f3589716058800b69c1e4357d4d296fc4"} Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.872538 4913 generic.go:334] "Generic (PLEG): container finished" podID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerID="972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179" exitCode=0
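
Note: the UnmountVolume.TearDown records above and the MountVolume.SetUp records for the replacement nova-scheduler-0 pod are the two halves of the kubelet's volume lifecycle: secret, projected, and empty-dir volumes are materialized under /var/lib/kubelet/pods/<pod-uid>/volumes/ when a pod starts and torn down from the same path when it is deleted. On the API side, a pod declares such a volume roughly as in the sketch below; only the secret name nova-scheduler-config-data is taken from the log (see the reflector record above), and the mount path is a hypothetical illustration:

    from kubernetes import client

    # A secret-backed volume and its container mount. The kubelet turns this
    # declaration into the VerifyControllerAttachedVolume / MountVolume.SetUp
    # steps seen in this log, and into UnmountVolume.TearDown on deletion.
    volume = client.V1Volume(
        name="config-data",
        secret=client.V1SecretVolumeSource(secret_name="nova-scheduler-config-data"),
    )
    mount = client.V1VolumeMount(
        name="config-data",
        mount_path="/etc/nova",  # assumed path, not from the log
        read_only=True,
    )

Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.872596 4913 util.go:48] "No ready sandbox for pod can be found.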
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.872611 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"57b23d8d-d51d-474a-b84d-644e88b745d5","Type":"ContainerDied","Data":"972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179"} Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.872719 4913 scope.go:117] "RemoveContainer" containerID="972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.873040 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"57b23d8d-d51d-474a-b84d-644e88b745d5","Type":"ContainerDied","Data":"90c4fb77eaefe411f55038dfd49791a5c91262cbfd776e2b210f81cef6a313dc"} Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.875596 4913 generic.go:334] "Generic (PLEG): container finished" podID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerID="6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c" exitCode=0 Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.875625 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68897558-9e02-481e-bf66-50dcdfcd8d42","Type":"ContainerDied","Data":"6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c"} Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.875640 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68897558-9e02-481e-bf66-50dcdfcd8d42","Type":"ContainerDied","Data":"e0aa16b8d31363a4e570c98a8be979bca89cd0d32ddbae0b9c70a1c766758651"} Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.875894 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.876840 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57b23d8d-d51d-474a-b84d-644e88b745d5" (UID: "57b23d8d-d51d-474a-b84d-644e88b745d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.881200 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-config-data" (OuterVolumeSpecName: "config-data") pod "68897558-9e02-481e-bf66-50dcdfcd8d42" (UID: "68897558-9e02-481e-bf66-50dcdfcd8d42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.881443 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-config-data" (OuterVolumeSpecName: "config-data") pod "57b23d8d-d51d-474a-b84d-644e88b745d5" (UID: "57b23d8d-d51d-474a-b84d-644e88b745d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.894009 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68897558-9e02-481e-bf66-50dcdfcd8d42" (UID: "68897558-9e02-481e-bf66-50dcdfcd8d42"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.895865 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "57b23d8d-d51d-474a-b84d-644e88b745d5" (UID: "57b23d8d-d51d-474a-b84d-644e88b745d5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.900242 4913 scope.go:117] "RemoveContainer" containerID="ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.904196 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68897558-9e02-481e-bf66-50dcdfcd8d42" (UID: "68897558-9e02-481e-bf66-50dcdfcd8d42"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.922975 4913 scope.go:117] "RemoveContainer" containerID="972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179" Oct 01 12:56:58 crc kubenswrapper[4913]: E1001 12:56:58.923453 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179\": container with ID starting with 972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179 not found: ID does not exist" containerID="972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.923477 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179"} err="failed to get container status \"972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179\": rpc error: code = NotFound desc = could not find container \"972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179\": container with ID starting with 972d38b7594739f52591c153a4e2627805516c29cd9091d91d8e914bc0440179 not found: ID does not exist" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.923498 4913 scope.go:117] "RemoveContainer" containerID="ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e" Oct 01 12:56:58 crc kubenswrapper[4913]: E1001 12:56:58.923740 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e\": container with ID starting with ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e not found: ID does not exist" containerID="ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.923759 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e"} err="failed to get container status \"ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e\": rpc error: code = NotFound desc = could not find container \"ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e\": container with ID starting with 
ec751f59c943396157f61850f6335f6e4e8c73e9ae7b87089da75c9370d93d7e not found: ID does not exist" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.923770 4913 scope.go:117] "RemoveContainer" containerID="6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946562 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946592 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68897558-9e02-481e-bf66-50dcdfcd8d42-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946603 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946613 4913 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946622 4913 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946631 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946640 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndn7d\" (UniqueName: \"kubernetes.io/projected/68897558-9e02-481e-bf66-50dcdfcd8d42-kube-api-access-ndn7d\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946650 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68897558-9e02-481e-bf66-50dcdfcd8d42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946659 4913 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/57b23d8d-d51d-474a-b84d-644e88b745d5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.946667 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-str8w\" (UniqueName: \"kubernetes.io/projected/57b23d8d-d51d-474a-b84d-644e88b745d5-kube-api-access-str8w\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.952682 4913 scope.go:117] "RemoveContainer" containerID="9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.981421 4913 scope.go:117] "RemoveContainer" containerID="6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c" Oct 01 12:56:58 crc kubenswrapper[4913]: E1001 12:56:58.983394 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c\": container with ID starting with 6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c not found: ID does not exist" containerID="6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.983426 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c"} err="failed to get container status \"6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c\": rpc error: code = NotFound desc = could not find container \"6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c\": container with ID starting with 6a741161e9f0b492f73b793406d87270a3e7d5e0d5224bea9a00767a5e66870c not found: ID does not exist" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.983446 4913 scope.go:117] "RemoveContainer" containerID="9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0" Oct 01 12:56:58 crc kubenswrapper[4913]: E1001 12:56:58.983865 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0\": container with ID starting with 9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0 not found: ID does not exist" containerID="9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0" Oct 01 12:56:58 crc kubenswrapper[4913]: I1001 12:56:58.983941 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0"} err="failed to get container status \"9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0\": rpc error: code = NotFound desc = could not find container \"9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0\": container with ID starting with 9a66d5db6c423fae1fc9d3a742884693912ced3f57207d75a4045f074f7edaa0 not found: ID does not exist" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.207044 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.214287 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.224508 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.233900 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.261409 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: E1001 12:56:59.261797 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-log" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.261814 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-log" Oct 01 12:56:59 crc kubenswrapper[4913]: E1001 12:56:59.261837 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-api" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.261845 4913 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-api" Oct 01 12:56:59 crc kubenswrapper[4913]: E1001 12:56:59.261854 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-metadata" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.261860 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-metadata" Oct 01 12:56:59 crc kubenswrapper[4913]: E1001 12:56:59.261866 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-log" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.261872 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-log" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.262041 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-log" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.262059 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" containerName="nova-api-api" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.262070 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-log" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.262079 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" containerName="nova-metadata-metadata" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.265049 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.267119 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.268909 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.271418 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.281824 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
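
Note: nova-api-0 and nova-metadata-0 are being recreated under new pod UIDs (68897558-9e02-481e-bf66-50dcdfcd8d42 becomes 5c193299-cf4e-4cd3-8ec0-6bba16872aa6, and 57b23d8d-d51d-474a-b84d-644e88b745d5 becomes bd947f3e-094f-4a5e-ac65-f24ae595ffdb). The cpu_manager and memory_manager "RemoveStaleState" records above exist because the kubelet keys that state by pod UID, so a same-named replacement pod inherits nothing from its predecessor. A quick way to observe the UID flip (a sketch, assuming kubeconfig access):

    from kubernetes import client, config

    # Print the current UID of nova-api-0; re-running this after a
    # delete/recreate cycle like the one in this log shows a different
    # metadata.uid each time, even though the pod name is unchanged.
    config.load_kube_config()
    pod = client.CoreV1Api().read_namespaced_pod("nova-api-0", "openstack")
    print(pod.metadata.name, pod.metadata.uid)

Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.283948 4913 util.go:30] "No sandbox for pod can be found.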
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.287218 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.288541 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.288834 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.289069 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.357992 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fql8k\" (UniqueName: \"kubernetes.io/projected/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-kube-api-access-fql8k\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358228 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-config-data\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358376 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358471 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358577 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-logs\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358646 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-config-data\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358729 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358818 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.358929 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564hc\" (UniqueName: \"kubernetes.io/projected/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-kube-api-access-564hc\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.359031 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-logs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.359117 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463728 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463788 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564hc\" (UniqueName: \"kubernetes.io/projected/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-kube-api-access-564hc\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463840 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-logs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463881 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463904 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fql8k\" (UniqueName: \"kubernetes.io/projected/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-kube-api-access-fql8k\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463925 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-config-data\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463942 4913 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463957 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.463986 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-logs\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.464004 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-config-data\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.464035 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.464730 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-logs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.464823 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-logs\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.474003 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.474144 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.479983 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-config-data\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.495150 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.498421 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fql8k\" (UniqueName: \"kubernetes.io/projected/bd947f3e-094f-4a5e-ac65-f24ae595ffdb-kube-api-access-fql8k\") pod \"nova-metadata-0\" (UID: \"bd947f3e-094f-4a5e-ac65-f24ae595ffdb\") " pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.499154 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-config-data\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.499625 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.500187 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.511868 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564hc\" (UniqueName: \"kubernetes.io/projected/5c193299-cf4e-4cd3-8ec0-6bba16872aa6-kube-api-access-564hc\") pod \"nova-api-0\" (UID: \"5c193299-cf4e-4cd3-8ec0-6bba16872aa6\") " pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.722578 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.728759 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.890778 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb00793e-8b71-47c4-8bce-1197d68a8b4b","Type":"ContainerStarted","Data":"35a3324ea45baaf358f6d128b2a82bf54dd1bfa6110703ea2fe9aea8d2652c22"} Oct 01 12:56:59 crc kubenswrapper[4913]: I1001 12:56:59.922081 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.922064012 podStartE2EDuration="2.922064012s" podCreationTimestamp="2025-10-01 12:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:59.91510233 +0000 UTC m=+1151.818577918" watchObservedRunningTime="2025-10-01 12:56:59.922064012 +0000 UTC m=+1151.825539580" Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.234403 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.285038 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:57:00 crc kubenswrapper[4913]: W1001 12:57:00.304381 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd947f3e_094f_4a5e_ac65_f24ae595ffdb.slice/crio-4ec5257e8ad5fe1fbcb4856e61a963b836926b64ea797afa7d4f9430cc24d2f4 WatchSource:0}: Error finding container 4ec5257e8ad5fe1fbcb4856e61a963b836926b64ea797afa7d4f9430cc24d2f4: Status 404 returned error can't find the container with id 4ec5257e8ad5fe1fbcb4856e61a963b836926b64ea797afa7d4f9430cc24d2f4 Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.828565 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b23d8d-d51d-474a-b84d-644e88b745d5" path="/var/lib/kubelet/pods/57b23d8d-d51d-474a-b84d-644e88b745d5/volumes" Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.830598 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68897558-9e02-481e-bf66-50dcdfcd8d42" path="/var/lib/kubelet/pods/68897558-9e02-481e-bf66-50dcdfcd8d42/volumes" Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.912172 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c193299-cf4e-4cd3-8ec0-6bba16872aa6","Type":"ContainerStarted","Data":"5ef602cbcc532dbf25a6c987fac3f8fd7b028fc06e4728e0358b7d82203d30e7"} Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.912238 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c193299-cf4e-4cd3-8ec0-6bba16872aa6","Type":"ContainerStarted","Data":"a2049191c1e973881602b1fc0f433d12dab74e482a416eb7857d7ef6cef33215"} Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.912253 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c193299-cf4e-4cd3-8ec0-6bba16872aa6","Type":"ContainerStarted","Data":"0371aa0952a43f04677ed2870ffb6e981afa928176eac4273a1a0fa4939d4dee"} Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.914537 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd947f3e-094f-4a5e-ac65-f24ae595ffdb","Type":"ContainerStarted","Data":"64de3d8bc33490103114ddade8c34d2a4fb47eed336d24650c2eb79d06a96bd7"} Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.914570 4913 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"bd947f3e-094f-4a5e-ac65-f24ae595ffdb","Type":"ContainerStarted","Data":"2de439eab705061e62daeae60a441cedd040dd61658b63e6cb6681e5c557faa8"} Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.914583 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd947f3e-094f-4a5e-ac65-f24ae595ffdb","Type":"ContainerStarted","Data":"4ec5257e8ad5fe1fbcb4856e61a963b836926b64ea797afa7d4f9430cc24d2f4"} Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.935120 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.935100176 podStartE2EDuration="1.935100176s" podCreationTimestamp="2025-10-01 12:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:00.933852862 +0000 UTC m=+1152.837328460" watchObservedRunningTime="2025-10-01 12:57:00.935100176 +0000 UTC m=+1152.838575754" Oct 01 12:57:00 crc kubenswrapper[4913]: I1001 12:57:00.959454 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.959434294 podStartE2EDuration="1.959434294s" podCreationTimestamp="2025-10-01 12:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:00.959007732 +0000 UTC m=+1152.862483330" watchObservedRunningTime="2025-10-01 12:57:00.959434294 +0000 UTC m=+1152.862909892" Oct 01 12:57:03 crc kubenswrapper[4913]: I1001 12:57:03.283503 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 12:57:03 crc kubenswrapper[4913]: I1001 12:57:03.449297 4913 scope.go:117] "RemoveContainer" containerID="be3e025bcc328e9f1265019ef5df83d23c096cd41a61ae5214c2b142b25f2fc8" Oct 01 12:57:03 crc kubenswrapper[4913]: I1001 12:57:03.471830 4913 scope.go:117] "RemoveContainer" containerID="b7a0f45b78b53275c37aa5752e2926a43f34ea4c0dedfe1ebb2724f134e96506" Oct 01 12:57:03 crc kubenswrapper[4913]: I1001 12:57:03.542975 4913 scope.go:117] "RemoveContainer" containerID="35b6808ded3a75f0a14804f514dfc68a2668ab2513f8c14da6d19f50b25d5b3b" Oct 01 12:57:04 crc kubenswrapper[4913]: I1001 12:57:04.723330 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:57:04 crc kubenswrapper[4913]: I1001 12:57:04.724623 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:57:05 crc kubenswrapper[4913]: I1001 12:57:05.228104 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 12:57:08 crc kubenswrapper[4913]: I1001 12:57:08.284182 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 12:57:08 crc kubenswrapper[4913]: I1001 12:57:08.317664 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 12:57:09 crc kubenswrapper[4913]: I1001 12:57:09.022747 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 12:57:09 crc kubenswrapper[4913]: I1001 12:57:09.722697 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:57:09 crc kubenswrapper[4913]: I1001 12:57:09.724627 4913 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:57:09 crc kubenswrapper[4913]: I1001 12:57:09.730141 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:57:09 crc kubenswrapper[4913]: I1001 12:57:09.730379 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:57:10 crc kubenswrapper[4913]: I1001 12:57:10.747411 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd947f3e-094f-4a5e-ac65-f24ae595ffdb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:57:10 crc kubenswrapper[4913]: I1001 12:57:10.747454 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c193299-cf4e-4cd3-8ec0-6bba16872aa6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 12:57:10 crc kubenswrapper[4913]: I1001 12:57:10.747529 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd947f3e-094f-4a5e-ac65-f24ae595ffdb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:57:10 crc kubenswrapper[4913]: I1001 12:57:10.747557 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c193299-cf4e-4cd3-8ec0-6bba16872aa6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.730468 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.731972 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.739903 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.740521 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.741112 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.744755 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 12:57:19 crc kubenswrapper[4913]: I1001 12:57:19.748996 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 12:57:20 crc kubenswrapper[4913]: I1001 12:57:20.092175 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 12:57:20 crc kubenswrapper[4913]: I1001 12:57:20.098549 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 12:57:20 crc kubenswrapper[4913]: I1001 12:57:20.099508 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Oct 01 12:57:28 crc kubenswrapper[4913]: I1001 12:57:28.303742 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:29 crc kubenswrapper[4913]: I1001 12:57:29.549193 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:32 crc kubenswrapper[4913]: I1001 12:57:32.926645 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="rabbitmq" containerID="cri-o://17fd7e042b77989a88964c948508ab83e998d19bc322214f5087728d5df3fcaa" gracePeriod=604796 Oct 01 12:57:33 crc kubenswrapper[4913]: I1001 12:57:33.577151 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="rabbitmq" containerID="cri-o://0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003" gracePeriod=604796 Oct 01 12:57:37 crc kubenswrapper[4913]: I1001 12:57:37.827490 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 01 12:57:38 crc kubenswrapper[4913]: I1001 12:57:38.120554 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.269380 4913 generic.go:334] "Generic (PLEG): container finished" podID="19a7d9a6-e81a-483a-8408-6784ded67834" containerID="17fd7e042b77989a88964c948508ab83e998d19bc322214f5087728d5df3fcaa" exitCode=0 Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.269470 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a7d9a6-e81a-483a-8408-6784ded67834","Type":"ContainerDied","Data":"17fd7e042b77989a88964c948508ab83e998d19bc322214f5087728d5df3fcaa"} Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.515522 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.703860 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a7d9a6-e81a-483a-8408-6784ded67834-erlang-cookie-secret\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704324 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-plugins-conf\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704354 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-erlang-cookie\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704393 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-config-data\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704417 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-plugins\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704435 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a7d9a6-e81a-483a-8408-6784ded67834-pod-info\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704458 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704482 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-server-conf\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704501 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfhhk\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-kube-api-access-zfhhk\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704562 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-tls\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: 
\"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.704585 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-confd\") pod \"19a7d9a6-e81a-483a-8408-6784ded67834\" (UID: \"19a7d9a6-e81a-483a-8408-6784ded67834\") " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.708877 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.709066 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.709188 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.720438 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/19a7d9a6-e81a-483a-8408-6784ded67834-pod-info" (OuterVolumeSpecName: "pod-info") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.731917 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.732474 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.736312 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19a7d9a6-e81a-483a-8408-6784ded67834-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.740843 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-kube-api-access-zfhhk" (OuterVolumeSpecName: "kube-api-access-zfhhk") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "kube-api-access-zfhhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.741681 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-config-data" (OuterVolumeSpecName: "config-data") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806452 4913 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806494 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806509 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806522 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806533 4913 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19a7d9a6-e81a-483a-8408-6784ded67834-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806558 4913 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806571 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfhhk\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-kube-api-access-zfhhk\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806583 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.806595 4913 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19a7d9a6-e81a-483a-8408-6784ded67834-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.822457 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-server-conf" (OuterVolumeSpecName: "server-conf") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.840622 4913 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.907712 4913 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.907740 4913 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19a7d9a6-e81a-483a-8408-6784ded67834-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:39 crc kubenswrapper[4913]: I1001 12:57:39.909430 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "19a7d9a6-e81a-483a-8408-6784ded67834" (UID: "19a7d9a6-e81a-483a-8408-6784ded67834"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.010011 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19a7d9a6-e81a-483a-8408-6784ded67834-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.073870 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111370 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111421 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-tls\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111451 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-erlang-cookie\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111493 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-erlang-cookie-secret\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111511 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-config-data\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111542 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-pod-info\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111567 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-confd\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111599 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vctsw\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-kube-api-access-vctsw\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111630 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-plugins-conf\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111655 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-server-conf\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: 
\"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.111677 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-plugins\") pod \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\" (UID: \"eafb0f7f-ea12-4d4f-9097-5923b9345bc0\") " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.112361 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.112767 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.116347 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-pod-info" (OuterVolumeSpecName: "pod-info") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.116424 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.120450 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-kube-api-access-vctsw" (OuterVolumeSpecName: "kube-api-access-vctsw") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "kube-api-access-vctsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.120752 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.121992 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.126489 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.159008 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-config-data" (OuterVolumeSpecName: "config-data") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.179896 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-server-conf" (OuterVolumeSpecName: "server-conf") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.214130 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.214162 4913 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.214171 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215064 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215084 4913 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215095 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215104 4913 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215114 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vctsw\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-kube-api-access-vctsw\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215124 
4913 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.215133 4913 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.236441 4913 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.237261 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "eafb0f7f-ea12-4d4f-9097-5923b9345bc0" (UID: "eafb0f7f-ea12-4d4f-9097-5923b9345bc0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.286041 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"19a7d9a6-e81a-483a-8408-6784ded67834","Type":"ContainerDied","Data":"4a117030bf9941b790d08e2ca7167093d342b11941e43bc4cce0ddd8fe4beb5d"} Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.286099 4913 scope.go:117] "RemoveContainer" containerID="17fd7e042b77989a88964c948508ab83e998d19bc322214f5087728d5df3fcaa" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.286098 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.289349 4913 generic.go:334] "Generic (PLEG): container finished" podID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerID="0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003" exitCode=0 Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.289386 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eafb0f7f-ea12-4d4f-9097-5923b9345bc0","Type":"ContainerDied","Data":"0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003"} Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.289411 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eafb0f7f-ea12-4d4f-9097-5923b9345bc0","Type":"ContainerDied","Data":"37514c9b2f9fbd5fa6b35cb513748fb615f5ec8e3d02462e9f8f1425b1c97f4e"} Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.289467 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.311930 4913 scope.go:117] "RemoveContainer" containerID="b702e6d96a91056bec1b26da179fba714390228d163abda83dd7909687a79615" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.316173 4913 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.316197 4913 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eafb0f7f-ea12-4d4f-9097-5923b9345bc0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.344419 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.353107 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.361901 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.366755 4913 scope.go:117] "RemoveContainer" containerID="0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.373953 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.387694 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: E1001 12:57:40.388094 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="rabbitmq" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.388113 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="rabbitmq" Oct 01 12:57:40 crc kubenswrapper[4913]: E1001 12:57:40.388129 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="setup-container" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.388135 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="setup-container" Oct 01 12:57:40 crc kubenswrapper[4913]: E1001 12:57:40.388147 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="setup-container" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.388152 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="setup-container" Oct 01 12:57:40 crc kubenswrapper[4913]: E1001 12:57:40.388166 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="rabbitmq" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.388171 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="rabbitmq" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.388390 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" containerName="rabbitmq" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.388408 4913 
memory_manager.go:354] "RemoveStaleState removing state" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" containerName="rabbitmq" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.389306 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394232 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394333 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394372 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-65sv7" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394391 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394496 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394501 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.394780 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.398253 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.402741 4913 scope.go:117] "RemoveContainer" containerID="0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.429392 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443369 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443501 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443564 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443625 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d60c488-fc39-4b3b-bd78-839f6975bcfa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443660 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d60c488-fc39-4b3b-bd78-839f6975bcfa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443723 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443744 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443826 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq82k\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-kube-api-access-nq82k\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443872 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.443945 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.444096 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.474450 4913 scope.go:117] "RemoveContainer" containerID="0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.491833 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.492065 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.495877 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.496031 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.496196 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.496336 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.496592 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.500613 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 12:57:40 crc kubenswrapper[4913]: E1001 12:57:40.501660 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003\": container with ID starting with 0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003 not found: ID does not exist" containerID="0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.501704 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003"} err="failed to get container status \"0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003\": rpc error: code = NotFound desc = could not find container \"0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003\": container with ID starting with 0c0d803652573be842be268d202df7cf1b234411e3f3748bb90b65cd3d43d003 not found: ID does not exist" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.501729 4913 scope.go:117] "RemoveContainer" containerID="0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.501738 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jlsgv" Oct 01 12:57:40 crc kubenswrapper[4913]: E1001 12:57:40.506358 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27\": container with ID starting with 0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27 not found: ID does not exist" containerID="0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.506521 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27"} err="failed to get container status \"0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27\": rpc error: code = NotFound desc = could not find container \"0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27\": container with ID starting with 0af59876f943b701332c73d88665e99a191224ea85bade54965220f15ee27b27 not found: ID does not exist" Oct 01 12:57:40 crc 
kubenswrapper[4913]: I1001 12:57:40.588461 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588524 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588555 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588581 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588606 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588629 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588666 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588707 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d60c488-fc39-4b3b-bd78-839f6975bcfa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588732 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588760 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7d60c488-fc39-4b3b-bd78-839f6975bcfa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588793 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588818 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588840 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588862 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588917 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq82k\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-kube-api-access-nq82k\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588950 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt7x\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-kube-api-access-ppt7x\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.588976 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.589003 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.589045 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.589063 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-config-data\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.589088 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.589124 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.589671 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.590199 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.590538 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.590538 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.591155 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.591691 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d60c488-fc39-4b3b-bd78-839f6975bcfa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.607247 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d60c488-fc39-4b3b-bd78-839f6975bcfa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.607316 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.607757 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d60c488-fc39-4b3b-bd78-839f6975bcfa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.607768 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.643636 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.647696 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq82k\" (UniqueName: \"kubernetes.io/projected/7d60c488-fc39-4b3b-bd78-839f6975bcfa-kube-api-access-nq82k\") pod \"rabbitmq-cell1-server-0\" (UID: \"7d60c488-fc39-4b3b-bd78-839f6975bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691209 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt7x\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-kube-api-access-ppt7x\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691282 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691316 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-config-data\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691333 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691375 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691391 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691409 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691425 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691466 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691493 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.691510 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.692142 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.692566 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.692880 
4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.692959 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-config-data\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.693212 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.693912 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.697750 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.702852 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.703482 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.710599 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.714907 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt7x\" (UniqueName: \"kubernetes.io/projected/3e71ec0f-d0d7-40a1-b83c-20f0dc177473-kube-api-access-ppt7x\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.730663 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.751963 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"3e71ec0f-d0d7-40a1-b83c-20f0dc177473\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.817735 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19a7d9a6-e81a-483a-8408-6784ded67834" path="/var/lib/kubelet/pods/19a7d9a6-e81a-483a-8408-6784ded67834/volumes" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.818479 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafb0f7f-ea12-4d4f-9097-5923b9345bc0" path="/var/lib/kubelet/pods/eafb0f7f-ea12-4d4f-9097-5923b9345bc0/volumes" Oct 01 12:57:40 crc kubenswrapper[4913]: I1001 12:57:40.819233 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4913]: I1001 12:57:41.232620 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:41 crc kubenswrapper[4913]: I1001 12:57:41.299196 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d60c488-fc39-4b3b-bd78-839f6975bcfa","Type":"ContainerStarted","Data":"d09905b4bd7289db361fca8dc9076630871673267c8dea4c1c65c4c8455759ff"} Oct 01 12:57:41 crc kubenswrapper[4913]: W1001 12:57:41.322214 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e71ec0f_d0d7_40a1_b83c_20f0dc177473.slice/crio-22e869daa7ce2651ccc6a6977064669053b40c6a25560ffed4109da7563e017e WatchSource:0}: Error finding container 22e869daa7ce2651ccc6a6977064669053b40c6a25560ffed4109da7563e017e: Status 404 returned error can't find the container with id 22e869daa7ce2651ccc6a6977064669053b40c6a25560ffed4109da7563e017e Oct 01 12:57:41 crc kubenswrapper[4913]: I1001 12:57:41.324997 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:42 crc kubenswrapper[4913]: I1001 12:57:42.310549 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e71ec0f-d0d7-40a1-b83c-20f0dc177473","Type":"ContainerStarted","Data":"22e869daa7ce2651ccc6a6977064669053b40c6a25560ffed4109da7563e017e"} Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.039588 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-547c766f9-k6lsk"] Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.041734 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.044147 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.057296 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547c766f9-k6lsk"] Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.137059 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-openstack-edpm-ipam\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.137108 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-nb\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.137156 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-sb\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.137329 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-config\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.137391 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5tf\" (UniqueName: \"kubernetes.io/projected/0a62c4b5-7585-428e-9498-525fa6c885b3-kube-api-access-rq5tf\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.137440 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-dns-svc\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.238897 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-config\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.238951 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5tf\" (UniqueName: \"kubernetes.io/projected/0a62c4b5-7585-428e-9498-525fa6c885b3-kube-api-access-rq5tf\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") 
" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.238978 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-dns-svc\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.239056 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-openstack-edpm-ipam\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.239080 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-nb\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.239127 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-sb\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.240145 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-openstack-edpm-ipam\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.240186 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-sb\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.240220 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-nb\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.240297 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-dns-svc\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.240737 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-config\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.255449 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5tf\" (UniqueName: \"kubernetes.io/projected/0a62c4b5-7585-428e-9498-525fa6c885b3-kube-api-access-rq5tf\") pod \"dnsmasq-dns-547c766f9-k6lsk\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") " pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.324000 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d60c488-fc39-4b3b-bd78-839f6975bcfa","Type":"ContainerStarted","Data":"6a220f379eac2f421a71c9956ab193ce8cae2ad26826c4cea85de58c24070f6d"} Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.326476 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e71ec0f-d0d7-40a1-b83c-20f0dc177473","Type":"ContainerStarted","Data":"d90b93733858a179a2b8520682f822c3228c70a850586dd69fa8232cdc3a3ad1"} Oct 01 12:57:43 crc kubenswrapper[4913]: I1001 12:57:43.360220 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:44 crc kubenswrapper[4913]: I1001 12:57:44.000587 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547c766f9-k6lsk"] Oct 01 12:57:44 crc kubenswrapper[4913]: W1001 12:57:44.010405 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a62c4b5_7585_428e_9498_525fa6c885b3.slice/crio-45e23eeb6115cc165e1f566f9435df75ef92f3465737e3dbd951d632e16df712 WatchSource:0}: Error finding container 45e23eeb6115cc165e1f566f9435df75ef92f3465737e3dbd951d632e16df712: Status 404 returned error can't find the container with id 45e23eeb6115cc165e1f566f9435df75ef92f3465737e3dbd951d632e16df712 Oct 01 12:57:44 crc kubenswrapper[4913]: I1001 12:57:44.335844 4913 generic.go:334] "Generic (PLEG): container finished" podID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerID="0d92b2e059b2c9103d3abff4499a070e7ee0f0de6f2feb6dd4d226e9be87899c" exitCode=0 Oct 01 12:57:44 crc kubenswrapper[4913]: I1001 12:57:44.335884 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" event={"ID":"0a62c4b5-7585-428e-9498-525fa6c885b3","Type":"ContainerDied","Data":"0d92b2e059b2c9103d3abff4499a070e7ee0f0de6f2feb6dd4d226e9be87899c"} Oct 01 12:57:44 crc kubenswrapper[4913]: I1001 12:57:44.336207 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" event={"ID":"0a62c4b5-7585-428e-9498-525fa6c885b3","Type":"ContainerStarted","Data":"45e23eeb6115cc165e1f566f9435df75ef92f3465737e3dbd951d632e16df712"} Oct 01 12:57:45 crc kubenswrapper[4913]: I1001 12:57:45.345896 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" event={"ID":"0a62c4b5-7585-428e-9498-525fa6c885b3","Type":"ContainerStarted","Data":"1aa89047c1b66f2f29187681b086dd97e4f59a31c368c9adcd9284eddf49c682"} Oct 01 12:57:45 crc kubenswrapper[4913]: I1001 12:57:45.346259 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:45 crc kubenswrapper[4913]: I1001 12:57:45.370503 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" podStartSLOduration=2.37048219 podStartE2EDuration="2.37048219s" podCreationTimestamp="2025-10-01 12:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:45.364708971 +0000 UTC m=+1197.268184569" watchObservedRunningTime="2025-10-01 12:57:45.37048219 +0000 UTC m=+1197.273957768" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.362556 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.417421 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657bf774d5-fwnrk"] Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.417658 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" podUID="96c0278e-106e-4985-a279-d8c89f141b15" containerName="dnsmasq-dns" containerID="cri-o://198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98" gracePeriod=10 Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.598377 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84968f68f7-g5z7n"] Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.600339 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.615424 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84968f68f7-g5z7n"] Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.745673 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-openstack-edpm-ipam\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.745735 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjsp\" (UniqueName: \"kubernetes.io/projected/0e9d019d-bd66-4323-843b-e52e0efa0771-kube-api-access-wgjsp\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.745797 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-nb\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.745828 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-sb\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.745900 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-dns-svc\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.745987 4913 
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.847761 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-nb\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.847805 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-sb\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.847843 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-dns-svc\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.847865 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-config\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.847962 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-openstack-edpm-ipam\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.847985 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgjsp\" (UniqueName: \"kubernetes.io/projected/0e9d019d-bd66-4323-843b-e52e0efa0771-kube-api-access-wgjsp\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.848805 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-nb\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.848880 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-dns-svc\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.849006 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-sb\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.849444 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-openstack-edpm-ipam\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.849515 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-config\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.870996 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjsp\" (UniqueName: \"kubernetes.io/projected/0e9d019d-bd66-4323-843b-e52e0efa0771-kube-api-access-wgjsp\") pod \"dnsmasq-dns-84968f68f7-g5z7n\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") " pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.920962 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:53 crc kubenswrapper[4913]: I1001 12:57:53.926947 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.052004 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-sb\") pod \"96c0278e-106e-4985-a279-d8c89f141b15\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") "
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.052052 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-config\") pod \"96c0278e-106e-4985-a279-d8c89f141b15\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") "
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.052099 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gx4s\" (UniqueName: \"kubernetes.io/projected/96c0278e-106e-4985-a279-d8c89f141b15-kube-api-access-9gx4s\") pod \"96c0278e-106e-4985-a279-d8c89f141b15\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") "
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.052177 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-dns-svc\") pod \"96c0278e-106e-4985-a279-d8c89f141b15\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") "
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.052244 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-nb\") pod \"96c0278e-106e-4985-a279-d8c89f141b15\" (UID: \"96c0278e-106e-4985-a279-d8c89f141b15\") "
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.084317 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c0278e-106e-4985-a279-d8c89f141b15-kube-api-access-9gx4s" (OuterVolumeSpecName: "kube-api-access-9gx4s") pod "96c0278e-106e-4985-a279-d8c89f141b15" (UID: "96c0278e-106e-4985-a279-d8c89f141b15"). InnerVolumeSpecName "kube-api-access-9gx4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.107890 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "96c0278e-106e-4985-a279-d8c89f141b15" (UID: "96c0278e-106e-4985-a279-d8c89f141b15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.109883 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "96c0278e-106e-4985-a279-d8c89f141b15" (UID: "96c0278e-106e-4985-a279-d8c89f141b15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.116895 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96c0278e-106e-4985-a279-d8c89f141b15" (UID: "96c0278e-106e-4985-a279-d8c89f141b15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.124108 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-config" (OuterVolumeSpecName: "config") pod "96c0278e-106e-4985-a279-d8c89f141b15" (UID: "96c0278e-106e-4985-a279-d8c89f141b15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.155054 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.155122 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.155137 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-config\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.155148 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gx4s\" (UniqueName: \"kubernetes.io/projected/96c0278e-106e-4985-a279-d8c89f141b15-kube-api-access-9gx4s\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.155157 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96c0278e-106e-4985-a279-d8c89f141b15-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.425070 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84968f68f7-g5z7n"]
Oct 01 12:57:54 crc kubenswrapper[4913]: W1001 12:57:54.433461 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e9d019d_bd66_4323_843b_e52e0efa0771.slice/crio-2d402ae383b84b724d1fd212af3666c27da5b9f5c5a6c893e9e3b42ed1bd1c78 WatchSource:0}: Error finding container 2d402ae383b84b724d1fd212af3666c27da5b9f5c5a6c893e9e3b42ed1bd1c78: Status 404 returned error can't find the container with id 2d402ae383b84b724d1fd212af3666c27da5b9f5c5a6c893e9e3b42ed1bd1c78
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.446137 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" event={"ID":"0e9d019d-bd66-4323-843b-e52e0efa0771","Type":"ContainerStarted","Data":"2d402ae383b84b724d1fd212af3666c27da5b9f5c5a6c893e9e3b42ed1bd1c78"}
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.447811 4913 generic.go:334] "Generic (PLEG): container finished" podID="96c0278e-106e-4985-a279-d8c89f141b15" containerID="198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98" exitCode=0
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.447849 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" event={"ID":"96c0278e-106e-4985-a279-d8c89f141b15","Type":"ContainerDied","Data":"198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98"}
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.447891 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk" event={"ID":"96c0278e-106e-4985-a279-d8c89f141b15","Type":"ContainerDied","Data":"7d33d0fa1caaf165d91fa4ba1a79325255eb507e6f8d310743393d0a8273b1da"}
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.447909 4913 scope.go:117] "RemoveContainer" containerID="198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.448073 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657bf774d5-fwnrk"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.470142 4913 scope.go:117] "RemoveContainer" containerID="894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.485640 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657bf774d5-fwnrk"]
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.493655 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-657bf774d5-fwnrk"]
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.506340 4913 scope.go:117] "RemoveContainer" containerID="198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98"
Oct 01 12:57:54 crc kubenswrapper[4913]: E1001 12:57:54.506920 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98\": container with ID starting with 198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98 not found: ID does not exist" containerID="198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.507018 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98"} err="failed to get container status \"198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98\": rpc error: code = NotFound desc = could not find container \"198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98\": container with ID starting with 198be700b826f0735785340d588b10cbe8173a162552e432ee8e68edf655ee98 not found: ID does not exist"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.507097 4913 scope.go:117] "RemoveContainer" containerID="894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c"
Oct 01 12:57:54 crc kubenswrapper[4913]: E1001 12:57:54.507421 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c\": container with ID starting with 894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c not found: ID does not exist" containerID="894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.507465 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c"} err="failed to get container status \"894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c\": rpc error: code = NotFound desc = could not find container \"894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c\": container with ID starting with 894f6aedb67c103c854a2574a1c310305046efa5b006824c729877921ac82e0c not found: ID does not exist"
Oct 01 12:57:54 crc kubenswrapper[4913]: I1001 12:57:54.819040 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c0278e-106e-4985-a279-d8c89f141b15" path="/var/lib/kubelet/pods/96c0278e-106e-4985-a279-d8c89f141b15/volumes"
Oct 01 12:57:55 crc kubenswrapper[4913]: I1001 12:57:55.456606 4913 generic.go:334] "Generic (PLEG): container finished" podID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerID="b59f85f7259eaf2c83e87fd08c188e17d8465b07c926bb259ebb8e94cd972429" exitCode=0
Oct 01 12:57:55 crc kubenswrapper[4913]: I1001 12:57:55.456684 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" event={"ID":"0e9d019d-bd66-4323-843b-e52e0efa0771","Type":"ContainerDied","Data":"b59f85f7259eaf2c83e87fd08c188e17d8465b07c926bb259ebb8e94cd972429"}
Oct 01 12:57:56 crc kubenswrapper[4913]: I1001 12:57:56.469687 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" event={"ID":"0e9d019d-bd66-4323-843b-e52e0efa0771","Type":"ContainerStarted","Data":"b102f83d21ae82772b304d7ba9cfde3ed5facfc0a16075ca87c13a6ee21f2071"}
Oct 01 12:57:56 crc kubenswrapper[4913]: I1001 12:57:56.470003 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:57:56 crc kubenswrapper[4913]: I1001 12:57:56.492223 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" podStartSLOduration=3.492206382 podStartE2EDuration="3.492206382s" podCreationTimestamp="2025-10-01 12:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:56.486652019 +0000 UTC m=+1208.390127617" watchObservedRunningTime="2025-10-01 12:57:56.492206382 +0000 UTC m=+1208.395681960"
Oct 01 12:58:03 crc kubenswrapper[4913]: I1001 12:58:03.923499 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 12:58:03 crc kubenswrapper[4913]: I1001 12:58:03.994747 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547c766f9-k6lsk"]
Oct 01 12:58:03 crc kubenswrapper[4913]: I1001 12:58:03.995515 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerName="dnsmasq-dns" containerID="cri-o://1aa89047c1b66f2f29187681b086dd97e4f59a31c368c9adcd9284eddf49c682" gracePeriod=10
Oct 01 12:58:04 crc kubenswrapper[4913]: I1001 12:58:04.537944 4913 generic.go:334] "Generic (PLEG): container finished" podID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerID="1aa89047c1b66f2f29187681b086dd97e4f59a31c368c9adcd9284eddf49c682" exitCode=0
Oct 01 12:58:04 crc kubenswrapper[4913]: I1001 12:58:04.537993 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" event={"ID":"0a62c4b5-7585-428e-9498-525fa6c885b3","Type":"ContainerDied","Data":"1aa89047c1b66f2f29187681b086dd97e4f59a31c368c9adcd9284eddf49c682"}
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.108693 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547c766f9-k6lsk"
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.254948 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-nb\") pod \"0a62c4b5-7585-428e-9498-525fa6c885b3\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") "
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.255009 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-dns-svc\") pod \"0a62c4b5-7585-428e-9498-525fa6c885b3\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") "
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.255183 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-config\") pod \"0a62c4b5-7585-428e-9498-525fa6c885b3\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") "
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.255214 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-openstack-edpm-ipam\") pod \"0a62c4b5-7585-428e-9498-525fa6c885b3\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") "
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.255243 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-sb\") pod \"0a62c4b5-7585-428e-9498-525fa6c885b3\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") "
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.255303 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq5tf\" (UniqueName: \"kubernetes.io/projected/0a62c4b5-7585-428e-9498-525fa6c885b3-kube-api-access-rq5tf\") pod \"0a62c4b5-7585-428e-9498-525fa6c885b3\" (UID: \"0a62c4b5-7585-428e-9498-525fa6c885b3\") "
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.262341 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a62c4b5-7585-428e-9498-525fa6c885b3-kube-api-access-rq5tf" (OuterVolumeSpecName: "kube-api-access-rq5tf") pod "0a62c4b5-7585-428e-9498-525fa6c885b3" (UID: "0a62c4b5-7585-428e-9498-525fa6c885b3"). InnerVolumeSpecName "kube-api-access-rq5tf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.306768 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-config" (OuterVolumeSpecName: "config") pod "0a62c4b5-7585-428e-9498-525fa6c885b3" (UID: "0a62c4b5-7585-428e-9498-525fa6c885b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.309606 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a62c4b5-7585-428e-9498-525fa6c885b3" (UID: "0a62c4b5-7585-428e-9498-525fa6c885b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.314224 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a62c4b5-7585-428e-9498-525fa6c885b3" (UID: "0a62c4b5-7585-428e-9498-525fa6c885b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.320610 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a62c4b5-7585-428e-9498-525fa6c885b3" (UID: "0a62c4b5-7585-428e-9498-525fa6c885b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.320753 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0a62c4b5-7585-428e-9498-525fa6c885b3" (UID: "0a62c4b5-7585-428e-9498-525fa6c885b3"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.358125 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-config\") on node \"crc\" DevicePath \"\""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.358162 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.358178 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.358228 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq5tf\" (UniqueName: \"kubernetes.io/projected/0a62c4b5-7585-428e-9498-525fa6c885b3-kube-api-access-rq5tf\") on node \"crc\" DevicePath \"\""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.358239 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.358246 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a62c4b5-7585-428e-9498-525fa6c885b3-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.548482 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547c766f9-k6lsk" event={"ID":"0a62c4b5-7585-428e-9498-525fa6c885b3","Type":"ContainerDied","Data":"45e23eeb6115cc165e1f566f9435df75ef92f3465737e3dbd951d632e16df712"}
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.548540 4913 scope.go:117] "RemoveContainer" containerID="1aa89047c1b66f2f29187681b086dd97e4f59a31c368c9adcd9284eddf49c682"
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.548565 4913 util.go:48] "No 
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.574729 4913 scope.go:117] "RemoveContainer" containerID="0d92b2e059b2c9103d3abff4499a070e7ee0f0de6f2feb6dd4d226e9be87899c"
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.595663 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547c766f9-k6lsk"]
Oct 01 12:58:05 crc kubenswrapper[4913]: I1001 12:58:05.607093 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-547c766f9-k6lsk"]
Oct 01 12:58:06 crc kubenswrapper[4913]: I1001 12:58:06.819146 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" path="/var/lib/kubelet/pods/0a62c4b5-7585-428e-9498-525fa6c885b3/volumes"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.276942 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"]
Oct 01 12:58:14 crc kubenswrapper[4913]: E1001 12:58:14.277772 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c0278e-106e-4985-a279-d8c89f141b15" containerName="dnsmasq-dns"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.277786 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c0278e-106e-4985-a279-d8c89f141b15" containerName="dnsmasq-dns"
Oct 01 12:58:14 crc kubenswrapper[4913]: E1001 12:58:14.277796 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerName="dnsmasq-dns"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.277803 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerName="dnsmasq-dns"
Oct 01 12:58:14 crc kubenswrapper[4913]: E1001 12:58:14.277815 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerName="init"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.277823 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerName="init"
Oct 01 12:58:14 crc kubenswrapper[4913]: E1001 12:58:14.277834 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c0278e-106e-4985-a279-d8c89f141b15" containerName="init"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.277840 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c0278e-106e-4985-a279-d8c89f141b15" containerName="init"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.278021 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a62c4b5-7585-428e-9498-525fa6c885b3" containerName="dnsmasq-dns"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.278042 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c0278e-106e-4985-a279-d8c89f141b15" containerName="dnsmasq-dns"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.278592 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.285743 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.285743 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.286103 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.286544 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.309103 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"]
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.412977 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnvm\" (UniqueName: \"kubernetes.io/projected/ff0f29f2-344b-41d8-aea2-7d29e013aeec-kube-api-access-mmnvm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.413157 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.413239 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.413355 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.515442 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.515509 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.515591 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.515663 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnvm\" (UniqueName: \"kubernetes.io/projected/ff0f29f2-344b-41d8-aea2-7d29e013aeec-kube-api-access-mmnvm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.521226 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.521319 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.521750 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.536087 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnvm\" (UniqueName: \"kubernetes.io/projected/ff0f29f2-344b-41d8-aea2-7d29e013aeec-kube-api-access-mmnvm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.602983 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.964203 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"]
Oct 01 12:58:14 crc kubenswrapper[4913]: I1001 12:58:14.973413 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 01 12:58:15 crc kubenswrapper[4913]: I1001 12:58:15.634097 4913 generic.go:334] "Generic (PLEG): container finished" podID="3e71ec0f-d0d7-40a1-b83c-20f0dc177473" containerID="d90b93733858a179a2b8520682f822c3228c70a850586dd69fa8232cdc3a3ad1" exitCode=0
Oct 01 12:58:15 crc kubenswrapper[4913]: I1001 12:58:15.634198 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e71ec0f-d0d7-40a1-b83c-20f0dc177473","Type":"ContainerDied","Data":"d90b93733858a179a2b8520682f822c3228c70a850586dd69fa8232cdc3a3ad1"}
Oct 01 12:58:15 crc kubenswrapper[4913]: I1001 12:58:15.635709 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" event={"ID":"ff0f29f2-344b-41d8-aea2-7d29e013aeec","Type":"ContainerStarted","Data":"3ca8e153f43c2b99914a6cf7938661fca219d61b919337595a1642233d9b631b"}
Oct 01 12:58:15 crc kubenswrapper[4913]: I1001 12:58:15.638287 4913 generic.go:334] "Generic (PLEG): container finished" podID="7d60c488-fc39-4b3b-bd78-839f6975bcfa" containerID="6a220f379eac2f421a71c9956ab193ce8cae2ad26826c4cea85de58c24070f6d" exitCode=0
Oct 01 12:58:15 crc kubenswrapper[4913]: I1001 12:58:15.638334 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d60c488-fc39-4b3b-bd78-839f6975bcfa","Type":"ContainerDied","Data":"6a220f379eac2f421a71c9956ab193ce8cae2ad26826c4cea85de58c24070f6d"}
Oct 01 12:58:16 crc kubenswrapper[4913]: I1001 12:58:16.650416 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e71ec0f-d0d7-40a1-b83c-20f0dc177473","Type":"ContainerStarted","Data":"a650a9ab42c535ef0684dd648af0a3d042d31fc368d8c41a8cb6f180c0632c4b"}
Oct 01 12:58:16 crc kubenswrapper[4913]: I1001 12:58:16.651163 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 01 12:58:16 crc kubenswrapper[4913]: I1001 12:58:16.652202 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7d60c488-fc39-4b3b-bd78-839f6975bcfa","Type":"ContainerStarted","Data":"9b5f3dd059e62953e395ce5b50a6edae4785a8af61036287805683092b4ac270"}
Oct 01 12:58:16 crc kubenswrapper[4913]: I1001 12:58:16.652671 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 12:58:16 crc kubenswrapper[4913]: I1001 12:58:16.679419 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.679402056 podStartE2EDuration="36.679402056s" podCreationTimestamp="2025-10-01 12:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:58:16.673002539 +0000 UTC m=+1228.576478127" watchObservedRunningTime="2025-10-01 12:58:16.679402056 +0000 UTC m=+1228.582877634"
Oct 01 12:58:18 crc kubenswrapper[4913]: I1001 12:58:18.829882 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.829814006 podStartE2EDuration="38.829814006s" podCreationTimestamp="2025-10-01 12:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:58:16.697692909 +0000 UTC m=+1228.601168507" watchObservedRunningTime="2025-10-01 12:58:18.829814006 +0000 UTC m=+1230.733289584"
pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.829814006 podStartE2EDuration="38.829814006s" podCreationTimestamp="2025-10-01 12:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:58:16.697692909 +0000 UTC m=+1228.601168507" watchObservedRunningTime="2025-10-01 12:58:18.829814006 +0000 UTC m=+1230.733289584" Oct 01 12:58:30 crc kubenswrapper[4913]: E1001 12:58:30.728399 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:9774f19d7a63d6f516afa701fb5f031674ad537e595049bbc57817356c7642fe" Oct 01 12:58:30 crc kubenswrapper[4913]: E1001 12:58:30.729518 4913 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 12:58:30 crc kubenswrapper[4913]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:9774f19d7a63d6f516afa701fb5f031674ad537e595049bbc57817356c7642fe,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Oct 01 12:58:30 crc kubenswrapper[4913]: - hosts: all Oct 01 12:58:30 crc kubenswrapper[4913]: strategy: linear Oct 01 12:58:30 crc kubenswrapper[4913]: tasks: Oct 01 12:58:30 crc kubenswrapper[4913]: - name: Enable podified-repos Oct 01 12:58:30 crc kubenswrapper[4913]: become: true Oct 01 12:58:30 crc kubenswrapper[4913]: ansible.builtin.shell: | Oct 01 12:58:30 crc kubenswrapper[4913]: set -euxo pipefail Oct 01 12:58:30 crc kubenswrapper[4913]: pushd /var/tmp Oct 01 12:58:30 crc kubenswrapper[4913]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Oct 01 12:58:30 crc kubenswrapper[4913]: pushd repo-setup-main Oct 01 12:58:30 crc kubenswrapper[4913]: python3 -m venv ./venv Oct 01 12:58:30 crc kubenswrapper[4913]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Oct 01 12:58:30 crc kubenswrapper[4913]: ./venv/bin/repo-setup current-podified -b antelope Oct 01 12:58:30 crc kubenswrapper[4913]: popd Oct 01 12:58:30 crc kubenswrapper[4913]: rm -rf repo-setup-main Oct 01 12:58:30 crc kubenswrapper[4913]: Oct 01 12:58:30 crc kubenswrapper[4913]: Oct 01 12:58:30 crc kubenswrapper[4913]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Oct 01 12:58:30 crc kubenswrapper[4913]: edpm_override_hosts: openstack-edpm-ipam Oct 01 12:58:30 crc kubenswrapper[4913]: edpm_service_type: repo-setup Oct 01 12:58:30 crc kubenswrapper[4913]: Oct 01 12:58:30 crc kubenswrapper[4913]: Oct 01 12:58:30 crc kubenswrapper[4913]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmnvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z_openstack(ff0f29f2-344b-41d8-aea2-7d29e013aeec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Oct 01 12:58:30 crc kubenswrapper[4913]: > logger="UnhandledError" Oct 01 12:58:30 crc kubenswrapper[4913]: E1001 12:58:30.730689 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" podUID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" Oct 01 12:58:30 crc kubenswrapper[4913]: I1001 12:58:30.735542 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:58:30 crc kubenswrapper[4913]: E1001 12:58:30.823028 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:9774f19d7a63d6f516afa701fb5f031674ad537e595049bbc57817356c7642fe\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" podUID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" Oct 01 12:58:30 crc kubenswrapper[4913]: I1001 12:58:30.826250 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3e71ec0f-d0d7-40a1-b83c-20f0dc177473" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.196:5671: connect: connection refused" Oct 01 12:58:40 crc kubenswrapper[4913]: I1001 12:58:40.083519 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:58:40 crc kubenswrapper[4913]: I1001 12:58:40.083827 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:58:40 crc kubenswrapper[4913]: I1001 12:58:40.823595 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 12:58:45 crc kubenswrapper[4913]: I1001 12:58:45.962996 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" event={"ID":"ff0f29f2-344b-41d8-aea2-7d29e013aeec","Type":"ContainerStarted","Data":"2356606cf64cc9eeabdb4c4d2edd3914e766727182e8deafc0bf581c5f9cc03c"} Oct 01 12:58:45 crc kubenswrapper[4913]: I1001 12:58:45.981250 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" podStartSLOduration=1.655821335 podStartE2EDuration="31.981230911s" podCreationTimestamp="2025-10-01 12:58:14 +0000 UTC" firstStartedPulling="2025-10-01 12:58:14.973125184 +0000 UTC m=+1226.876600762" lastFinishedPulling="2025-10-01 12:58:45.29853475 +0000 UTC m=+1257.202010338" observedRunningTime="2025-10-01 12:58:45.980339037 +0000 UTC m=+1257.883814685" watchObservedRunningTime="2025-10-01 12:58:45.981230911 +0000 UTC m=+1257.884706489" Oct 01 12:58:58 crc kubenswrapper[4913]: I1001 12:58:58.087167 4913 generic.go:334] "Generic (PLEG): container finished" podID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" containerID="2356606cf64cc9eeabdb4c4d2edd3914e766727182e8deafc0bf581c5f9cc03c" exitCode=0 Oct 01 12:58:58 crc kubenswrapper[4913]: I1001 12:58:58.087254 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" event={"ID":"ff0f29f2-344b-41d8-aea2-7d29e013aeec","Type":"ContainerDied","Data":"2356606cf64cc9eeabdb4c4d2edd3914e766727182e8deafc0bf581c5f9cc03c"} Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.502623 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.624744 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmnvm\" (UniqueName: \"kubernetes.io/projected/ff0f29f2-344b-41d8-aea2-7d29e013aeec-kube-api-access-mmnvm\") pod \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.624878 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-inventory\") pod \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.624980 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-repo-setup-combined-ca-bundle\") pod \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.625066 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-ssh-key\") pod \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\" (UID: \"ff0f29f2-344b-41d8-aea2-7d29e013aeec\") " Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.630265 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ff0f29f2-344b-41d8-aea2-7d29e013aeec" (UID: "ff0f29f2-344b-41d8-aea2-7d29e013aeec"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.630344 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0f29f2-344b-41d8-aea2-7d29e013aeec-kube-api-access-mmnvm" (OuterVolumeSpecName: "kube-api-access-mmnvm") pod "ff0f29f2-344b-41d8-aea2-7d29e013aeec" (UID: "ff0f29f2-344b-41d8-aea2-7d29e013aeec"). InnerVolumeSpecName "kube-api-access-mmnvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.650507 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ff0f29f2-344b-41d8-aea2-7d29e013aeec" (UID: "ff0f29f2-344b-41d8-aea2-7d29e013aeec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.652030 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-inventory" (OuterVolumeSpecName: "inventory") pod "ff0f29f2-344b-41d8-aea2-7d29e013aeec" (UID: "ff0f29f2-344b-41d8-aea2-7d29e013aeec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.726740 4913 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.726772 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.726786 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmnvm\" (UniqueName: \"kubernetes.io/projected/ff0f29f2-344b-41d8-aea2-7d29e013aeec-kube-api-access-mmnvm\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:59 crc kubenswrapper[4913]: I1001 12:58:59.726797 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff0f29f2-344b-41d8-aea2-7d29e013aeec-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.108132 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" event={"ID":"ff0f29f2-344b-41d8-aea2-7d29e013aeec","Type":"ContainerDied","Data":"3ca8e153f43c2b99914a6cf7938661fca219d61b919337595a1642233d9b631b"} Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.108525 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca8e153f43c2b99914a6cf7938661fca219d61b919337595a1642233d9b631b" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.108251 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.242011 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4"] Oct 01 12:59:00 crc kubenswrapper[4913]: E1001 12:59:00.246659 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.246698 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.247391 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.248384 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.252167 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.259697 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.260247 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.261022 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.275758 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4"] Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.451140 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6zz5\" (UniqueName: \"kubernetes.io/projected/5f1c1c18-0832-49de-88ae-40f2bc5be31f-kube-api-access-f6zz5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.451253 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.452393 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.452493 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.555006 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.555108 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.555205 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.555400 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6zz5\" (UniqueName: \"kubernetes.io/projected/5f1c1c18-0832-49de-88ae-40f2bc5be31f-kube-api-access-f6zz5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.561109 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.562049 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.563096 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.575353 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6zz5\" (UniqueName: \"kubernetes.io/projected/5f1c1c18-0832-49de-88ae-40f2bc5be31f-kube-api-access-f6zz5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:00 crc kubenswrapper[4913]: I1001 12:59:00.874647 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 12:59:01 crc kubenswrapper[4913]: I1001 12:59:01.516878 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4"] Oct 01 12:59:02 crc kubenswrapper[4913]: I1001 12:59:02.129566 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" event={"ID":"5f1c1c18-0832-49de-88ae-40f2bc5be31f","Type":"ContainerStarted","Data":"e1511567a0f93feb0fa49bf3bdc8a03386cd5284c58277989af2f76d176537fd"} Oct 01 12:59:07 crc kubenswrapper[4913]: I1001 12:59:07.205280 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" event={"ID":"5f1c1c18-0832-49de-88ae-40f2bc5be31f","Type":"ContainerStarted","Data":"066f85645a28e552a720991dd457c68fdaebe90e2675cdbfeeed9f2ac7145ed6"} Oct 01 12:59:07 crc kubenswrapper[4913]: I1001 12:59:07.225550 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" podStartSLOduration=2.641447292 podStartE2EDuration="7.225530517s" podCreationTimestamp="2025-10-01 12:59:00 +0000 UTC" firstStartedPulling="2025-10-01 12:59:01.528666229 +0000 UTC m=+1273.432141807" lastFinishedPulling="2025-10-01 12:59:06.112749464 +0000 UTC m=+1278.016225032" observedRunningTime="2025-10-01 12:59:07.222392021 +0000 UTC m=+1279.125867619" watchObservedRunningTime="2025-10-01 12:59:07.225530517 +0000 UTC m=+1279.129006105" Oct 01 12:59:10 crc kubenswrapper[4913]: I1001 12:59:10.083651 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:59:10 crc kubenswrapper[4913]: I1001 12:59:10.084304 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.083498 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.084197 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.084257 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.085017 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fd610d0107ae658b14a61d2241d34b3304551116724eb973af1cbc4a77d29ef1"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.085082 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://fd610d0107ae658b14a61d2241d34b3304551116724eb973af1cbc4a77d29ef1" gracePeriod=600 Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.531883 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="fd610d0107ae658b14a61d2241d34b3304551116724eb973af1cbc4a77d29ef1" exitCode=0 Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.531994 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"fd610d0107ae658b14a61d2241d34b3304551116724eb973af1cbc4a77d29ef1"} Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.532228 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7"} Oct 01 12:59:40 crc kubenswrapper[4913]: I1001 12:59:40.532253 4913 scope.go:117] "RemoveContainer" containerID="b8ccdeaa9feae2c057a74d4a7cb5ae3ea008156e58af6b9bb65a4673a0aaa9d4" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.156393 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw"] Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.158077 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.162229 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.162587 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.179089 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw"] Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.268789 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4rx\" (UniqueName: \"kubernetes.io/projected/45859544-5cfd-44cb-8414-3c21d7256c2a-kube-api-access-rz4rx\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.269158 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45859544-5cfd-44cb-8414-3c21d7256c2a-config-volume\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.269339 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45859544-5cfd-44cb-8414-3c21d7256c2a-secret-volume\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.371335 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45859544-5cfd-44cb-8414-3c21d7256c2a-secret-volume\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.371493 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4rx\" (UniqueName: \"kubernetes.io/projected/45859544-5cfd-44cb-8414-3c21d7256c2a-kube-api-access-rz4rx\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.371603 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45859544-5cfd-44cb-8414-3c21d7256c2a-config-volume\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.372677 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45859544-5cfd-44cb-8414-3c21d7256c2a-config-volume\") pod 
\"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.377694 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45859544-5cfd-44cb-8414-3c21d7256c2a-secret-volume\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.391420 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4rx\" (UniqueName: \"kubernetes.io/projected/45859544-5cfd-44cb-8414-3c21d7256c2a-kube-api-access-rz4rx\") pod \"collect-profiles-29322060-8vzmw\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.498715 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:00 crc kubenswrapper[4913]: I1001 13:00:00.931701 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw"] Oct 01 13:00:01 crc kubenswrapper[4913]: I1001 13:00:01.761523 4913 generic.go:334] "Generic (PLEG): container finished" podID="45859544-5cfd-44cb-8414-3c21d7256c2a" containerID="f7bb353c4fa40aeec972fd5d5327b3f88a4ac37c68cfe3a110296315779e7bb4" exitCode=0 Oct 01 13:00:01 crc kubenswrapper[4913]: I1001 13:00:01.761577 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" event={"ID":"45859544-5cfd-44cb-8414-3c21d7256c2a","Type":"ContainerDied","Data":"f7bb353c4fa40aeec972fd5d5327b3f88a4ac37c68cfe3a110296315779e7bb4"} Oct 01 13:00:01 crc kubenswrapper[4913]: I1001 13:00:01.761980 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" event={"ID":"45859544-5cfd-44cb-8414-3c21d7256c2a","Type":"ContainerStarted","Data":"aa893fb6fe1bd08788fa1be664a7baaa91ac5a0b33d0ebd3153cfaade9a45144"} Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.070955 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.232833 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz4rx\" (UniqueName: \"kubernetes.io/projected/45859544-5cfd-44cb-8414-3c21d7256c2a-kube-api-access-rz4rx\") pod \"45859544-5cfd-44cb-8414-3c21d7256c2a\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.233013 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45859544-5cfd-44cb-8414-3c21d7256c2a-secret-volume\") pod \"45859544-5cfd-44cb-8414-3c21d7256c2a\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.233124 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45859544-5cfd-44cb-8414-3c21d7256c2a-config-volume\") pod \"45859544-5cfd-44cb-8414-3c21d7256c2a\" (UID: \"45859544-5cfd-44cb-8414-3c21d7256c2a\") " Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.234293 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45859544-5cfd-44cb-8414-3c21d7256c2a-config-volume" (OuterVolumeSpecName: "config-volume") pod "45859544-5cfd-44cb-8414-3c21d7256c2a" (UID: "45859544-5cfd-44cb-8414-3c21d7256c2a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.241920 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45859544-5cfd-44cb-8414-3c21d7256c2a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "45859544-5cfd-44cb-8414-3c21d7256c2a" (UID: "45859544-5cfd-44cb-8414-3c21d7256c2a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.242820 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45859544-5cfd-44cb-8414-3c21d7256c2a-kube-api-access-rz4rx" (OuterVolumeSpecName: "kube-api-access-rz4rx") pod "45859544-5cfd-44cb-8414-3c21d7256c2a" (UID: "45859544-5cfd-44cb-8414-3c21d7256c2a"). InnerVolumeSpecName "kube-api-access-rz4rx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.335505 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45859544-5cfd-44cb-8414-3c21d7256c2a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.335554 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45859544-5cfd-44cb-8414-3c21d7256c2a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.336080 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz4rx\" (UniqueName: \"kubernetes.io/projected/45859544-5cfd-44cb-8414-3c21d7256c2a-kube-api-access-rz4rx\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.781702 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" event={"ID":"45859544-5cfd-44cb-8414-3c21d7256c2a","Type":"ContainerDied","Data":"aa893fb6fe1bd08788fa1be664a7baaa91ac5a0b33d0ebd3153cfaade9a45144"} Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.782154 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa893fb6fe1bd08788fa1be664a7baaa91ac5a0b33d0ebd3153cfaade9a45144" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.781722 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw" Oct 01 13:00:03 crc kubenswrapper[4913]: I1001 13:00:03.959537 4913 scope.go:117] "RemoveContainer" containerID="ba645eac3745b09349271c20b4f7b9a1300165ad9616229a6de32a710cad40ed" Oct 01 13:00:04 crc kubenswrapper[4913]: I1001 13:00:04.014060 4913 scope.go:117] "RemoveContainer" containerID="cd8de2d4fdb47138d9c8b7c9ec4d871b0d65877ad8eef06ab76e7383fcc813db" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.152512 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322061-r8fks"] Oct 01 13:01:00 crc kubenswrapper[4913]: E1001 13:01:00.153487 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45859544-5cfd-44cb-8414-3c21d7256c2a" containerName="collect-profiles" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.153506 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="45859544-5cfd-44cb-8414-3c21d7256c2a" containerName="collect-profiles" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.153684 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="45859544-5cfd-44cb-8414-3c21d7256c2a" containerName="collect-profiles" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.154789 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.165308 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322061-r8fks"] Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.216199 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-combined-ca-bundle\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.216299 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-config-data\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.216378 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-fernet-keys\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.216651 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57nrz\" (UniqueName: \"kubernetes.io/projected/de7f94d4-57c9-4d92-8b60-678373217f05-kube-api-access-57nrz\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.319422 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57nrz\" (UniqueName: \"kubernetes.io/projected/de7f94d4-57c9-4d92-8b60-678373217f05-kube-api-access-57nrz\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.319532 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-combined-ca-bundle\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.319583 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-config-data\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.319643 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-fernet-keys\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.327970 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-fernet-keys\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.328171 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-combined-ca-bundle\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.328920 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-config-data\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.342819 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57nrz\" (UniqueName: \"kubernetes.io/projected/de7f94d4-57c9-4d92-8b60-678373217f05-kube-api-access-57nrz\") pod \"keystone-cron-29322061-r8fks\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.480429 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:00 crc kubenswrapper[4913]: I1001 13:01:00.906888 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322061-r8fks"] Oct 01 13:01:00 crc kubenswrapper[4913]: W1001 13:01:00.919330 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde7f94d4_57c9_4d92_8b60_678373217f05.slice/crio-664a1450fea23271a50865c8758cdd301bea130249833b1d789f9de92c771eb2 WatchSource:0}: Error finding container 664a1450fea23271a50865c8758cdd301bea130249833b1d789f9de92c771eb2: Status 404 returned error can't find the container with id 664a1450fea23271a50865c8758cdd301bea130249833b1d789f9de92c771eb2 Oct 01 13:01:01 crc kubenswrapper[4913]: I1001 13:01:01.329860 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-r8fks" event={"ID":"de7f94d4-57c9-4d92-8b60-678373217f05","Type":"ContainerStarted","Data":"30690ff62bb7cce8c504151c986ccd93ac25d6bf05bee363b58f2982fbd49002"} Oct 01 13:01:01 crc kubenswrapper[4913]: I1001 13:01:01.330173 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-r8fks" event={"ID":"de7f94d4-57c9-4d92-8b60-678373217f05","Type":"ContainerStarted","Data":"664a1450fea23271a50865c8758cdd301bea130249833b1d789f9de92c771eb2"} Oct 01 13:01:01 crc kubenswrapper[4913]: I1001 13:01:01.355116 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322061-r8fks" podStartSLOduration=1.355095898 podStartE2EDuration="1.355095898s" podCreationTimestamp="2025-10-01 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:01:01.351954751 +0000 UTC m=+1393.255430349" watchObservedRunningTime="2025-10-01 13:01:01.355095898 +0000 UTC m=+1393.258571476" Oct 01 13:01:03 crc kubenswrapper[4913]: I1001 13:01:03.394823 4913 
generic.go:334] "Generic (PLEG): container finished" podID="de7f94d4-57c9-4d92-8b60-678373217f05" containerID="30690ff62bb7cce8c504151c986ccd93ac25d6bf05bee363b58f2982fbd49002" exitCode=0 Oct 01 13:01:03 crc kubenswrapper[4913]: I1001 13:01:03.394930 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-r8fks" event={"ID":"de7f94d4-57c9-4d92-8b60-678373217f05","Type":"ContainerDied","Data":"30690ff62bb7cce8c504151c986ccd93ac25d6bf05bee363b58f2982fbd49002"} Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.799617 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.921659 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-fernet-keys\") pod \"de7f94d4-57c9-4d92-8b60-678373217f05\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.921783 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-combined-ca-bundle\") pod \"de7f94d4-57c9-4d92-8b60-678373217f05\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.921864 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-config-data\") pod \"de7f94d4-57c9-4d92-8b60-678373217f05\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.921898 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57nrz\" (UniqueName: \"kubernetes.io/projected/de7f94d4-57c9-4d92-8b60-678373217f05-kube-api-access-57nrz\") pod \"de7f94d4-57c9-4d92-8b60-678373217f05\" (UID: \"de7f94d4-57c9-4d92-8b60-678373217f05\") " Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.932451 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "de7f94d4-57c9-4d92-8b60-678373217f05" (UID: "de7f94d4-57c9-4d92-8b60-678373217f05"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.932620 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7f94d4-57c9-4d92-8b60-678373217f05-kube-api-access-57nrz" (OuterVolumeSpecName: "kube-api-access-57nrz") pod "de7f94d4-57c9-4d92-8b60-678373217f05" (UID: "de7f94d4-57c9-4d92-8b60-678373217f05"). InnerVolumeSpecName "kube-api-access-57nrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.947725 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de7f94d4-57c9-4d92-8b60-678373217f05" (UID: "de7f94d4-57c9-4d92-8b60-678373217f05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:04 crc kubenswrapper[4913]: I1001 13:01:04.968593 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-config-data" (OuterVolumeSpecName: "config-data") pod "de7f94d4-57c9-4d92-8b60-678373217f05" (UID: "de7f94d4-57c9-4d92-8b60-678373217f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.024096 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.024138 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.024154 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57nrz\" (UniqueName: \"kubernetes.io/projected/de7f94d4-57c9-4d92-8b60-678373217f05-kube-api-access-57nrz\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.024167 4913 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7f94d4-57c9-4d92-8b60-678373217f05-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.421763 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-r8fks" event={"ID":"de7f94d4-57c9-4d92-8b60-678373217f05","Type":"ContainerDied","Data":"664a1450fea23271a50865c8758cdd301bea130249833b1d789f9de92c771eb2"} Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.421827 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664a1450fea23271a50865c8758cdd301bea130249833b1d789f9de92c771eb2" Oct 01 13:01:05 crc kubenswrapper[4913]: I1001 13:01:05.421840 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322061-r8fks" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.107659 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4drbd"] Oct 01 13:01:39 crc kubenswrapper[4913]: E1001 13:01:39.108674 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7f94d4-57c9-4d92-8b60-678373217f05" containerName="keystone-cron" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.108688 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7f94d4-57c9-4d92-8b60-678373217f05" containerName="keystone-cron" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.108908 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7f94d4-57c9-4d92-8b60-678373217f05" containerName="keystone-cron" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.110793 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.124643 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4drbd"] Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.227149 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-catalog-content\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.227182 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-utilities\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.227300 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7kd\" (UniqueName: \"kubernetes.io/projected/8623ef26-df0d-4c57-b01f-52d381326e82-kube-api-access-zb7kd\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.328165 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-catalog-content\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.328204 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-utilities\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.328300 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7kd\" (UniqueName: \"kubernetes.io/projected/8623ef26-df0d-4c57-b01f-52d381326e82-kube-api-access-zb7kd\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.328747 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-utilities\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.328819 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-catalog-content\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.347043 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zb7kd\" (UniqueName: \"kubernetes.io/projected/8623ef26-df0d-4c57-b01f-52d381326e82-kube-api-access-zb7kd\") pod \"certified-operators-4drbd\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:39 crc kubenswrapper[4913]: I1001 13:01:39.460836 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:40 crc kubenswrapper[4913]: I1001 13:01:40.012182 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4drbd"] Oct 01 13:01:40 crc kubenswrapper[4913]: I1001 13:01:40.083844 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:01:40 crc kubenswrapper[4913]: I1001 13:01:40.083901 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:01:40 crc kubenswrapper[4913]: I1001 13:01:40.750761 4913 generic.go:334] "Generic (PLEG): container finished" podID="8623ef26-df0d-4c57-b01f-52d381326e82" containerID="35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e" exitCode=0 Oct 01 13:01:40 crc kubenswrapper[4913]: I1001 13:01:40.750882 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drbd" event={"ID":"8623ef26-df0d-4c57-b01f-52d381326e82","Type":"ContainerDied","Data":"35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e"} Oct 01 13:01:40 crc kubenswrapper[4913]: I1001 13:01:40.750966 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drbd" event={"ID":"8623ef26-df0d-4c57-b01f-52d381326e82","Type":"ContainerStarted","Data":"66e10e8c98561e3ab6dd20c0834560e2762f95c8796a16632b5aa7203dd946c4"} Oct 01 13:01:42 crc kubenswrapper[4913]: I1001 13:01:42.769958 4913 generic.go:334] "Generic (PLEG): container finished" podID="8623ef26-df0d-4c57-b01f-52d381326e82" containerID="d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973" exitCode=0 Oct 01 13:01:42 crc kubenswrapper[4913]: I1001 13:01:42.770011 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drbd" event={"ID":"8623ef26-df0d-4c57-b01f-52d381326e82","Type":"ContainerDied","Data":"d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973"} Oct 01 13:01:43 crc kubenswrapper[4913]: I1001 13:01:43.786137 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drbd" event={"ID":"8623ef26-df0d-4c57-b01f-52d381326e82","Type":"ContainerStarted","Data":"22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87"} Oct 01 13:01:43 crc kubenswrapper[4913]: I1001 13:01:43.814258 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4drbd" podStartSLOduration=2.118486893 podStartE2EDuration="4.814232281s" podCreationTimestamp="2025-10-01 13:01:39 +0000 UTC" 
firstStartedPulling="2025-10-01 13:01:40.752144214 +0000 UTC m=+1432.655619822" lastFinishedPulling="2025-10-01 13:01:43.447889582 +0000 UTC m=+1435.351365210" observedRunningTime="2025-10-01 13:01:43.81204035 +0000 UTC m=+1435.715515938" watchObservedRunningTime="2025-10-01 13:01:43.814232281 +0000 UTC m=+1435.717707899" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.139987 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdjj"] Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.142336 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.153636 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdjj"] Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.304111 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqdz\" (UniqueName: \"kubernetes.io/projected/9697b347-802d-434b-a4b2-e775dfcfee43-kube-api-access-thqdz\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.304215 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-utilities\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.304332 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-catalog-content\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.406090 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdz\" (UniqueName: \"kubernetes.io/projected/9697b347-802d-434b-a4b2-e775dfcfee43-kube-api-access-thqdz\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.406243 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-utilities\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.406336 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-catalog-content\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.406989 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-utilities\") pod 
\"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.407006 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-catalog-content\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.432458 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqdz\" (UniqueName: \"kubernetes.io/projected/9697b347-802d-434b-a4b2-e775dfcfee43-kube-api-access-thqdz\") pod \"redhat-marketplace-7kdjj\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.497745 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:48 crc kubenswrapper[4913]: I1001 13:01:48.930023 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdjj"] Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.461445 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.461562 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.506856 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.837179 4913 generic.go:334] "Generic (PLEG): container finished" podID="9697b347-802d-434b-a4b2-e775dfcfee43" containerID="2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283" exitCode=0 Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.837292 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerDied","Data":"2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283"} Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.837558 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerStarted","Data":"cf31ac5d81beefcad656b6e9fd9f58b8ce091aee2fba7faa49c6a6803ecd557a"} Oct 01 13:01:49 crc kubenswrapper[4913]: I1001 13:01:49.888252 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:50 crc kubenswrapper[4913]: I1001 13:01:50.846854 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerStarted","Data":"ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e"} Oct 01 13:01:51 crc kubenswrapper[4913]: I1001 13:01:51.860531 4913 generic.go:334] "Generic (PLEG): container finished" podID="9697b347-802d-434b-a4b2-e775dfcfee43" containerID="ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e" exitCode=0 Oct 01 13:01:51 crc 
kubenswrapper[4913]: I1001 13:01:51.860707 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerDied","Data":"ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e"} Oct 01 13:01:51 crc kubenswrapper[4913]: I1001 13:01:51.909133 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4drbd"] Oct 01 13:01:51 crc kubenswrapper[4913]: I1001 13:01:51.909414 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4drbd" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="registry-server" containerID="cri-o://22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87" gracePeriod=2 Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.367006 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.389030 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb7kd\" (UniqueName: \"kubernetes.io/projected/8623ef26-df0d-4c57-b01f-52d381326e82-kube-api-access-zb7kd\") pod \"8623ef26-df0d-4c57-b01f-52d381326e82\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.389121 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-utilities\") pod \"8623ef26-df0d-4c57-b01f-52d381326e82\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.389324 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-catalog-content\") pod \"8623ef26-df0d-4c57-b01f-52d381326e82\" (UID: \"8623ef26-df0d-4c57-b01f-52d381326e82\") " Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.390036 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-utilities" (OuterVolumeSpecName: "utilities") pod "8623ef26-df0d-4c57-b01f-52d381326e82" (UID: "8623ef26-df0d-4c57-b01f-52d381326e82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.396943 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8623ef26-df0d-4c57-b01f-52d381326e82-kube-api-access-zb7kd" (OuterVolumeSpecName: "kube-api-access-zb7kd") pod "8623ef26-df0d-4c57-b01f-52d381326e82" (UID: "8623ef26-df0d-4c57-b01f-52d381326e82"). InnerVolumeSpecName "kube-api-access-zb7kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.434361 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8623ef26-df0d-4c57-b01f-52d381326e82" (UID: "8623ef26-df0d-4c57-b01f-52d381326e82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.491204 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.491238 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb7kd\" (UniqueName: \"kubernetes.io/projected/8623ef26-df0d-4c57-b01f-52d381326e82-kube-api-access-zb7kd\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.491253 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8623ef26-df0d-4c57-b01f-52d381326e82-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.882339 4913 generic.go:334] "Generic (PLEG): container finished" podID="8623ef26-df0d-4c57-b01f-52d381326e82" containerID="22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87" exitCode=0 Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.882409 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drbd" event={"ID":"8623ef26-df0d-4c57-b01f-52d381326e82","Type":"ContainerDied","Data":"22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87"} Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.882426 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4drbd" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.882438 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drbd" event={"ID":"8623ef26-df0d-4c57-b01f-52d381326e82","Type":"ContainerDied","Data":"66e10e8c98561e3ab6dd20c0834560e2762f95c8796a16632b5aa7203dd946c4"} Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.882458 4913 scope.go:117] "RemoveContainer" containerID="22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.890073 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerStarted","Data":"9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c"} Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.913093 4913 scope.go:117] "RemoveContainer" containerID="d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.916548 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4drbd"] Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.934329 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4drbd"] Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.937079 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7kdjj" podStartSLOduration=2.457188663 podStartE2EDuration="4.937062816s" podCreationTimestamp="2025-10-01 13:01:48 +0000 UTC" firstStartedPulling="2025-10-01 13:01:49.838681805 +0000 UTC m=+1441.742157373" lastFinishedPulling="2025-10-01 13:01:52.318555938 +0000 UTC m=+1444.222031526" observedRunningTime="2025-10-01 13:01:52.923235154 +0000 UTC m=+1444.826710742" 
watchObservedRunningTime="2025-10-01 13:01:52.937062816 +0000 UTC m=+1444.840538394" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.942203 4913 scope.go:117] "RemoveContainer" containerID="35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.961636 4913 scope.go:117] "RemoveContainer" containerID="22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87" Oct 01 13:01:52 crc kubenswrapper[4913]: E1001 13:01:52.961990 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87\": container with ID starting with 22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87 not found: ID does not exist" containerID="22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.962020 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87"} err="failed to get container status \"22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87\": rpc error: code = NotFound desc = could not find container \"22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87\": container with ID starting with 22e87cc603539f224fdd5511c8efd3ebce08f720d394858245d99a305cf96b87 not found: ID does not exist" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.962041 4913 scope.go:117] "RemoveContainer" containerID="d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973" Oct 01 13:01:52 crc kubenswrapper[4913]: E1001 13:01:52.962260 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973\": container with ID starting with d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973 not found: ID does not exist" containerID="d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.962294 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973"} err="failed to get container status \"d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973\": rpc error: code = NotFound desc = could not find container \"d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973\": container with ID starting with d787db636dd6551aba50caf19839750707b9178884b0b5a13925cdf0c2d79973 not found: ID does not exist" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.962305 4913 scope.go:117] "RemoveContainer" containerID="35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e" Oct 01 13:01:52 crc kubenswrapper[4913]: E1001 13:01:52.962512 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e\": container with ID starting with 35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e not found: ID does not exist" containerID="35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e" Oct 01 13:01:52 crc kubenswrapper[4913]: I1001 13:01:52.962530 4913 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e"} err="failed to get container status \"35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e\": rpc error: code = NotFound desc = could not find container \"35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e\": container with ID starting with 35634d621094d310b7a9ba4d3a3f4e15126eaa2a5dec2bedb96adf05c822740e not found: ID does not exist" Oct 01 13:01:54 crc kubenswrapper[4913]: I1001 13:01:54.821655 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" path="/var/lib/kubelet/pods/8623ef26-df0d-4c57-b01f-52d381326e82/volumes" Oct 01 13:01:58 crc kubenswrapper[4913]: I1001 13:01:58.498507 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:58 crc kubenswrapper[4913]: I1001 13:01:58.498807 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:58 crc kubenswrapper[4913]: I1001 13:01:58.569092 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:59 crc kubenswrapper[4913]: I1001 13:01:59.009116 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:01:59 crc kubenswrapper[4913]: I1001 13:01:59.068576 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdjj"] Oct 01 13:02:00 crc kubenswrapper[4913]: I1001 13:02:00.997761 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7kdjj" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="registry-server" containerID="cri-o://9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c" gracePeriod=2 Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.579182 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.763524 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-utilities\") pod \"9697b347-802d-434b-a4b2-e775dfcfee43\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.763571 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-catalog-content\") pod \"9697b347-802d-434b-a4b2-e775dfcfee43\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.763687 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thqdz\" (UniqueName: \"kubernetes.io/projected/9697b347-802d-434b-a4b2-e775dfcfee43-kube-api-access-thqdz\") pod \"9697b347-802d-434b-a4b2-e775dfcfee43\" (UID: \"9697b347-802d-434b-a4b2-e775dfcfee43\") " Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.764586 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-utilities" (OuterVolumeSpecName: "utilities") pod "9697b347-802d-434b-a4b2-e775dfcfee43" (UID: "9697b347-802d-434b-a4b2-e775dfcfee43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.764924 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.771670 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9697b347-802d-434b-a4b2-e775dfcfee43-kube-api-access-thqdz" (OuterVolumeSpecName: "kube-api-access-thqdz") pod "9697b347-802d-434b-a4b2-e775dfcfee43" (UID: "9697b347-802d-434b-a4b2-e775dfcfee43"). InnerVolumeSpecName "kube-api-access-thqdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.798181 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9697b347-802d-434b-a4b2-e775dfcfee43" (UID: "9697b347-802d-434b-a4b2-e775dfcfee43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.866682 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9697b347-802d-434b-a4b2-e775dfcfee43-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:01 crc kubenswrapper[4913]: I1001 13:02:01.866802 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thqdz\" (UniqueName: \"kubernetes.io/projected/9697b347-802d-434b-a4b2-e775dfcfee43-kube-api-access-thqdz\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.010538 4913 generic.go:334] "Generic (PLEG): container finished" podID="9697b347-802d-434b-a4b2-e775dfcfee43" containerID="9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c" exitCode=0 Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.010705 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerDied","Data":"9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c"} Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.010945 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kdjj" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.010966 4913 scope.go:117] "RemoveContainer" containerID="9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.010947 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kdjj" event={"ID":"9697b347-802d-434b-a4b2-e775dfcfee43","Type":"ContainerDied","Data":"cf31ac5d81beefcad656b6e9fd9f58b8ce091aee2fba7faa49c6a6803ecd557a"} Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.042241 4913 scope.go:117] "RemoveContainer" containerID="ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.061653 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdjj"] Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.078506 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kdjj"] Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.079983 4913 scope.go:117] "RemoveContainer" containerID="2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.123888 4913 scope.go:117] "RemoveContainer" containerID="9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c" Oct 01 13:02:02 crc kubenswrapper[4913]: E1001 13:02:02.124409 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c\": container with ID starting with 9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c not found: ID does not exist" containerID="9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.124459 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c"} err="failed to get container status 
\"9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c\": rpc error: code = NotFound desc = could not find container \"9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c\": container with ID starting with 9ff9a244503780831bed6500a9ee0055b2afef5f9380289a22d87d602dacc46c not found: ID does not exist" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.124496 4913 scope.go:117] "RemoveContainer" containerID="ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e" Oct 01 13:02:02 crc kubenswrapper[4913]: E1001 13:02:02.124949 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e\": container with ID starting with ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e not found: ID does not exist" containerID="ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.124998 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e"} err="failed to get container status \"ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e\": rpc error: code = NotFound desc = could not find container \"ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e\": container with ID starting with ab5f5611ae4afa5fe03f1bbc9d4ed11cd83ccb5aeba597e6db2e1829c769d63e not found: ID does not exist" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.125035 4913 scope.go:117] "RemoveContainer" containerID="2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283" Oct 01 13:02:02 crc kubenswrapper[4913]: E1001 13:02:02.125548 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283\": container with ID starting with 2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283 not found: ID does not exist" containerID="2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.125662 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283"} err="failed to get container status \"2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283\": rpc error: code = NotFound desc = could not find container \"2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283\": container with ID starting with 2b11ac133ab939ed560bb4e7f3558281b1ac2f3029aee1afa2beb7b03cbc4283 not found: ID does not exist" Oct 01 13:02:02 crc kubenswrapper[4913]: I1001 13:02:02.827937 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" path="/var/lib/kubelet/pods/9697b347-802d-434b-a4b2-e775dfcfee43/volumes" Oct 01 13:02:06 crc kubenswrapper[4913]: I1001 13:02:06.052452 4913 generic.go:334] "Generic (PLEG): container finished" podID="5f1c1c18-0832-49de-88ae-40f2bc5be31f" containerID="066f85645a28e552a720991dd457c68fdaebe90e2675cdbfeeed9f2ac7145ed6" exitCode=0 Oct 01 13:02:06 crc kubenswrapper[4913]: I1001 13:02:06.052566 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" 
event={"ID":"5f1c1c18-0832-49de-88ae-40f2bc5be31f","Type":"ContainerDied","Data":"066f85645a28e552a720991dd457c68fdaebe90e2675cdbfeeed9f2ac7145ed6"} Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.456583 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.596969 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-ssh-key\") pod \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.597076 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-inventory\") pod \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.597126 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6zz5\" (UniqueName: \"kubernetes.io/projected/5f1c1c18-0832-49de-88ae-40f2bc5be31f-kube-api-access-f6zz5\") pod \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.597356 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-bootstrap-combined-ca-bundle\") pod \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\" (UID: \"5f1c1c18-0832-49de-88ae-40f2bc5be31f\") " Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.603097 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1c1c18-0832-49de-88ae-40f2bc5be31f-kube-api-access-f6zz5" (OuterVolumeSpecName: "kube-api-access-f6zz5") pod "5f1c1c18-0832-49de-88ae-40f2bc5be31f" (UID: "5f1c1c18-0832-49de-88ae-40f2bc5be31f"). InnerVolumeSpecName "kube-api-access-f6zz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.604143 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5f1c1c18-0832-49de-88ae-40f2bc5be31f" (UID: "5f1c1c18-0832-49de-88ae-40f2bc5be31f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.628036 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-inventory" (OuterVolumeSpecName: "inventory") pod "5f1c1c18-0832-49de-88ae-40f2bc5be31f" (UID: "5f1c1c18-0832-49de-88ae-40f2bc5be31f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.639258 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f1c1c18-0832-49de-88ae-40f2bc5be31f" (UID: "5f1c1c18-0832-49de-88ae-40f2bc5be31f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.700255 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.700316 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.700332 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6zz5\" (UniqueName: \"kubernetes.io/projected/5f1c1c18-0832-49de-88ae-40f2bc5be31f-kube-api-access-f6zz5\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:07 crc kubenswrapper[4913]: I1001 13:02:07.700346 4913 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1c1c18-0832-49de-88ae-40f2bc5be31f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.077429 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" event={"ID":"5f1c1c18-0832-49de-88ae-40f2bc5be31f","Type":"ContainerDied","Data":"e1511567a0f93feb0fa49bf3bdc8a03386cd5284c58277989af2f76d176537fd"} Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.077474 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.077479 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1511567a0f93feb0fa49bf3bdc8a03386cd5284c58277989af2f76d176537fd" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.216759 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb"] Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217194 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1c1c18-0832-49de-88ae-40f2bc5be31f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217216 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1c1c18-0832-49de-88ae-40f2bc5be31f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217236 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="registry-server" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217244 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="registry-server" Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217289 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="extract-utilities" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217298 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="extract-utilities" Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217309 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="extract-content" 
Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217316 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="extract-content" Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217329 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="registry-server" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217336 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="registry-server" Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217351 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="extract-utilities" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217358 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="extract-utilities" Oct 01 13:02:08 crc kubenswrapper[4913]: E1001 13:02:08.217380 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="extract-content" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217387 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="extract-content" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217583 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1c1c18-0832-49de-88ae-40f2bc5be31f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217599 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="9697b347-802d-434b-a4b2-e775dfcfee43" containerName="registry-server" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.217623 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8623ef26-df0d-4c57-b01f-52d381326e82" containerName="registry-server" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.218333 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.224705 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.224924 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.225083 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.225211 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.225798 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb"] Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.411117 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.411165 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2q86\" (UniqueName: \"kubernetes.io/projected/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-kube-api-access-w2q86\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.411345 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.512756 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.512807 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.512838 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2q86\" (UniqueName: \"kubernetes.io/projected/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-kube-api-access-w2q86\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.519302 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.520492 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.543260 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2q86\" (UniqueName: \"kubernetes.io/projected/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-kube-api-access-w2q86\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:08 crc kubenswrapper[4913]: I1001 13:02:08.570527 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:02:09 crc kubenswrapper[4913]: I1001 13:02:09.139955 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb"] Oct 01 13:02:10 crc kubenswrapper[4913]: I1001 13:02:10.084139 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:02:10 crc kubenswrapper[4913]: I1001 13:02:10.084580 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:02:10 crc kubenswrapper[4913]: I1001 13:02:10.097428 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" event={"ID":"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c","Type":"ContainerStarted","Data":"378edc9f10c714ad8b9ae64357adec5085cee223828e2ab2ecf3ef2a1e764ffc"} Oct 01 13:02:10 crc kubenswrapper[4913]: I1001 13:02:10.097482 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" event={"ID":"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c","Type":"ContainerStarted","Data":"0078059719d5ba2a066d4b2261d214199fdcdc09b86d67901ebea5e494b75f16"} Oct 01 13:02:10 crc kubenswrapper[4913]: I1001 13:02:10.127862 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" podStartSLOduration=1.5776913879999999 podStartE2EDuration="2.127835064s" podCreationTimestamp="2025-10-01 13:02:08 +0000 UTC" firstStartedPulling="2025-10-01 13:02:09.138206074 +0000 UTC m=+1461.041681652" lastFinishedPulling="2025-10-01 13:02:09.68834975 +0000 UTC m=+1461.591825328" observedRunningTime="2025-10-01 13:02:10.121411555 +0000 UTC m=+1462.024887163" watchObservedRunningTime="2025-10-01 13:02:10.127835064 +0000 UTC m=+1462.031310682" Oct 01 13:02:31 crc kubenswrapper[4913]: I1001 13:02:31.743738 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tl7mr"] Oct 01 13:02:31 crc kubenswrapper[4913]: I1001 13:02:31.748993 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:31 crc kubenswrapper[4913]: I1001 13:02:31.753686 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tl7mr"] Oct 01 13:02:31 crc kubenswrapper[4913]: I1001 13:02:31.934351 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad95641-9502-4c02-93aa-f77003a85ebb-catalog-content\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:31 crc kubenswrapper[4913]: I1001 13:02:31.935644 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad95641-9502-4c02-93aa-f77003a85ebb-utilities\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:31 crc kubenswrapper[4913]: I1001 13:02:31.935844 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46lc\" (UniqueName: \"kubernetes.io/projected/1ad95641-9502-4c02-93aa-f77003a85ebb-kube-api-access-h46lc\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.037476 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad95641-9502-4c02-93aa-f77003a85ebb-catalog-content\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.037588 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad95641-9502-4c02-93aa-f77003a85ebb-utilities\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.037688 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h46lc\" (UniqueName: \"kubernetes.io/projected/1ad95641-9502-4c02-93aa-f77003a85ebb-kube-api-access-h46lc\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.038573 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad95641-9502-4c02-93aa-f77003a85ebb-catalog-content\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.038721 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad95641-9502-4c02-93aa-f77003a85ebb-utilities\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.061511 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h46lc\" (UniqueName: \"kubernetes.io/projected/1ad95641-9502-4c02-93aa-f77003a85ebb-kube-api-access-h46lc\") pod \"redhat-operators-tl7mr\" (UID: \"1ad95641-9502-4c02-93aa-f77003a85ebb\") " pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.083519 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:32 crc kubenswrapper[4913]: W1001 13:02:32.523258 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad95641_9502_4c02_93aa_f77003a85ebb.slice/crio-182faeb363c4756715932e0d2cc21916175b0325e5737021513dccf497319579 WatchSource:0}: Error finding container 182faeb363c4756715932e0d2cc21916175b0325e5737021513dccf497319579: Status 404 returned error can't find the container with id 182faeb363c4756715932e0d2cc21916175b0325e5737021513dccf497319579 Oct 01 13:02:32 crc kubenswrapper[4913]: I1001 13:02:32.523645 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tl7mr"] Oct 01 13:02:33 crc kubenswrapper[4913]: I1001 13:02:33.312484 4913 generic.go:334] "Generic (PLEG): container finished" podID="1ad95641-9502-4c02-93aa-f77003a85ebb" containerID="cf4fd6a5e390f0440d976ef5dee08573fee9317462558326e7fa26b557179ecc" exitCode=0 Oct 01 13:02:33 crc kubenswrapper[4913]: I1001 13:02:33.312600 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl7mr" event={"ID":"1ad95641-9502-4c02-93aa-f77003a85ebb","Type":"ContainerDied","Data":"cf4fd6a5e390f0440d976ef5dee08573fee9317462558326e7fa26b557179ecc"} Oct 01 13:02:33 crc kubenswrapper[4913]: I1001 13:02:33.312800 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl7mr" event={"ID":"1ad95641-9502-4c02-93aa-f77003a85ebb","Type":"ContainerStarted","Data":"182faeb363c4756715932e0d2cc21916175b0325e5737021513dccf497319579"} Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.084106 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.084753 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.084809 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.086473 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.086700 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" gracePeriod=600 Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.380710 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" exitCode=0 Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.380753 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7"} Oct 01 13:02:40 crc kubenswrapper[4913]: I1001 13:02:40.380785 4913 scope.go:117] "RemoveContainer" containerID="fd610d0107ae658b14a61d2241d34b3304551116724eb973af1cbc4a77d29ef1" Oct 01 13:02:40 crc kubenswrapper[4913]: E1001 13:02:40.703659 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:02:41 crc kubenswrapper[4913]: I1001 13:02:41.390013 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:02:41 crc kubenswrapper[4913]: E1001 13:02:41.390350 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:02:48 crc kubenswrapper[4913]: I1001 13:02:48.454964 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl7mr" event={"ID":"1ad95641-9502-4c02-93aa-f77003a85ebb","Type":"ContainerStarted","Data":"4f04ac923f7e0ac5ec15f2c5c2a2f033757b92125876878fac04d543e428e6b0"} Oct 01 13:02:49 crc kubenswrapper[4913]: I1001 13:02:49.464209 4913 generic.go:334] "Generic (PLEG): container finished" podID="1ad95641-9502-4c02-93aa-f77003a85ebb" 
containerID="4f04ac923f7e0ac5ec15f2c5c2a2f033757b92125876878fac04d543e428e6b0" exitCode=0 Oct 01 13:02:49 crc kubenswrapper[4913]: I1001 13:02:49.464319 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl7mr" event={"ID":"1ad95641-9502-4c02-93aa-f77003a85ebb","Type":"ContainerDied","Data":"4f04ac923f7e0ac5ec15f2c5c2a2f033757b92125876878fac04d543e428e6b0"} Oct 01 13:02:51 crc kubenswrapper[4913]: I1001 13:02:51.482945 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl7mr" event={"ID":"1ad95641-9502-4c02-93aa-f77003a85ebb","Type":"ContainerStarted","Data":"2511f2ccbf413739394bcb8c17702a1a7e533f59c1bd22db0c2e10304e045f64"} Oct 01 13:02:51 crc kubenswrapper[4913]: I1001 13:02:51.507597 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tl7mr" podStartSLOduration=3.29346985 podStartE2EDuration="20.507578423s" podCreationTimestamp="2025-10-01 13:02:31 +0000 UTC" firstStartedPulling="2025-10-01 13:02:33.315729929 +0000 UTC m=+1485.219205507" lastFinishedPulling="2025-10-01 13:02:50.529838502 +0000 UTC m=+1502.433314080" observedRunningTime="2025-10-01 13:02:51.503232812 +0000 UTC m=+1503.406708410" watchObservedRunningTime="2025-10-01 13:02:51.507578423 +0000 UTC m=+1503.411054001" Oct 01 13:02:51 crc kubenswrapper[4913]: I1001 13:02:51.806595 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:02:51 crc kubenswrapper[4913]: E1001 13:02:51.807127 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:02:52 crc kubenswrapper[4913]: I1001 13:02:52.084103 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:52 crc kubenswrapper[4913]: I1001 13:02:52.084160 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:02:53 crc kubenswrapper[4913]: I1001 13:02:53.139767 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tl7mr" podUID="1ad95641-9502-4c02-93aa-f77003a85ebb" containerName="registry-server" probeResult="failure" output=< Oct 01 13:02:53 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s Oct 01 13:02:53 crc kubenswrapper[4913]: > Oct 01 13:03:02 crc kubenswrapper[4913]: I1001 13:03:02.127242 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:03:02 crc kubenswrapper[4913]: I1001 13:03:02.178573 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tl7mr" Oct 01 13:03:02 crc kubenswrapper[4913]: I1001 13:03:02.824223 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tl7mr"] Oct 01 13:03:02 crc kubenswrapper[4913]: I1001 13:03:02.942840 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rmfsv"] Oct 01 13:03:02 
crc kubenswrapper[4913]: I1001 13:03:02.943057 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rmfsv" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="registry-server" containerID="cri-o://49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0" gracePeriod=2 Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.512652 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.610029 4913 generic.go:334] "Generic (PLEG): container finished" podID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerID="49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0" exitCode=0 Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.610124 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmfsv" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.610142 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmfsv" event={"ID":"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1","Type":"ContainerDied","Data":"49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0"} Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.610218 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmfsv" event={"ID":"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1","Type":"ContainerDied","Data":"1e3695d895137762bb9f4aa110bdd78e1ee6f7e14432af73fc297b0a61a6e03b"} Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.610238 4913 scope.go:117] "RemoveContainer" containerID="49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.615877 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-utilities\") pod \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.615934 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzj26\" (UniqueName: \"kubernetes.io/projected/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-kube-api-access-tzj26\") pod \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.616049 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-catalog-content\") pod \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\" (UID: \"0135ef10-a28c-42de-bc9c-bdc0cd20e8e1\") " Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.617800 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-utilities" (OuterVolumeSpecName: "utilities") pod "0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" (UID: "0135ef10-a28c-42de-bc9c-bdc0cd20e8e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.637374 4913 scope.go:117] "RemoveContainer" containerID="b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.657147 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-kube-api-access-tzj26" (OuterVolumeSpecName: "kube-api-access-tzj26") pod "0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" (UID: "0135ef10-a28c-42de-bc9c-bdc0cd20e8e1"). InnerVolumeSpecName "kube-api-access-tzj26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.697836 4913 scope.go:117] "RemoveContainer" containerID="a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.718811 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.718848 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzj26\" (UniqueName: \"kubernetes.io/projected/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-kube-api-access-tzj26\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.726403 4913 scope.go:117] "RemoveContainer" containerID="49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0" Oct 01 13:03:03 crc kubenswrapper[4913]: E1001 13:03:03.726861 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0\": container with ID starting with 49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0 not found: ID does not exist" containerID="49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.726905 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0"} err="failed to get container status \"49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0\": rpc error: code = NotFound desc = could not find container \"49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0\": container with ID starting with 49bbc5287fd65ee2b200e7f79525aeb22bdb2947f52164747dcfeee598f660b0 not found: ID does not exist" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.726933 4913 scope.go:117] "RemoveContainer" containerID="b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d" Oct 01 13:03:03 crc kubenswrapper[4913]: E1001 13:03:03.727385 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d\": container with ID starting with b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d not found: ID does not exist" containerID="b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.727442 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d"} err="failed to get 
container status \"b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d\": rpc error: code = NotFound desc = could not find container \"b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d\": container with ID starting with b6539e816951278b52c58a646f80f9a437811a81ea7ecf0f4110b022418a5a9d not found: ID does not exist" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.727476 4913 scope.go:117] "RemoveContainer" containerID="a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e" Oct 01 13:03:03 crc kubenswrapper[4913]: E1001 13:03:03.728102 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e\": container with ID starting with a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e not found: ID does not exist" containerID="a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.728132 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e"} err="failed to get container status \"a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e\": rpc error: code = NotFound desc = could not find container \"a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e\": container with ID starting with a7490f0b87c3d63418e19d97ccaafd7b957892ed220ad3d3f88391879233436e not found: ID does not exist" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.735507 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" (UID: "0135ef10-a28c-42de-bc9c-bdc0cd20e8e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:03:03 crc kubenswrapper[4913]: I1001 13:03:03.820866 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:04 crc kubenswrapper[4913]: I1001 13:03:04.054668 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rmfsv"] Oct 01 13:03:04 crc kubenswrapper[4913]: I1001 13:03:04.062824 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rmfsv"] Oct 01 13:03:04 crc kubenswrapper[4913]: I1001 13:03:04.807372 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:03:04 crc kubenswrapper[4913]: E1001 13:03:04.807711 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:03:04 crc kubenswrapper[4913]: I1001 13:03:04.819675 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" path="/var/lib/kubelet/pods/0135ef10-a28c-42de-bc9c-bdc0cd20e8e1/volumes" Oct 01 13:03:14 crc kubenswrapper[4913]: I1001 13:03:14.043344 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-g8njq"] Oct 01 13:03:14 crc kubenswrapper[4913]: I1001 13:03:14.051321 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5b585"] Oct 01 13:03:14 crc kubenswrapper[4913]: I1001 13:03:14.063698 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-5b585"] Oct 01 13:03:14 crc kubenswrapper[4913]: I1001 13:03:14.071691 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-g8njq"] Oct 01 13:03:14 crc kubenswrapper[4913]: I1001 13:03:14.824398 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0410901f-4718-42a7-9e61-a64722c67b5c" path="/var/lib/kubelet/pods/0410901f-4718-42a7-9e61-a64722c67b5c/volumes" Oct 01 13:03:14 crc kubenswrapper[4913]: I1001 13:03:14.825255 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e" path="/var/lib/kubelet/pods/f2cae9e2-ba7f-48a1-b95f-4813bca1ea6e/volumes" Oct 01 13:03:16 crc kubenswrapper[4913]: I1001 13:03:16.806736 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:03:16 crc kubenswrapper[4913]: E1001 13:03:16.807358 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:03:19 crc kubenswrapper[4913]: I1001 13:03:19.739092 4913 generic.go:334] "Generic (PLEG): container finished" 
podID="cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" containerID="378edc9f10c714ad8b9ae64357adec5085cee223828e2ab2ecf3ef2a1e764ffc" exitCode=0 Oct 01 13:03:19 crc kubenswrapper[4913]: I1001 13:03:19.739174 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" event={"ID":"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c","Type":"ContainerDied","Data":"378edc9f10c714ad8b9ae64357adec5085cee223828e2ab2ecf3ef2a1e764ffc"} Oct 01 13:03:20 crc kubenswrapper[4913]: I1001 13:03:20.025677 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wbw8j"] Oct 01 13:03:20 crc kubenswrapper[4913]: I1001 13:03:20.035770 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wbw8j"] Oct 01 13:03:20 crc kubenswrapper[4913]: I1001 13:03:20.832748 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def990e9-e3a6-44f5-9a22-fcab13b131b0" path="/var/lib/kubelet/pods/def990e9-e3a6-44f5-9a22-fcab13b131b0/volumes" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.162163 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.348329 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2q86\" (UniqueName: \"kubernetes.io/projected/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-kube-api-access-w2q86\") pod \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.348406 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-inventory\") pod \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.348684 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-ssh-key\") pod \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\" (UID: \"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c\") " Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.353532 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-kube-api-access-w2q86" (OuterVolumeSpecName: "kube-api-access-w2q86") pod "cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" (UID: "cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c"). InnerVolumeSpecName "kube-api-access-w2q86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.383761 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" (UID: "cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.386031 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-inventory" (OuterVolumeSpecName: "inventory") pod "cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" (UID: "cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.450593 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.450624 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2q86\" (UniqueName: \"kubernetes.io/projected/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-kube-api-access-w2q86\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.450636 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.757680 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" event={"ID":"cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c","Type":"ContainerDied","Data":"0078059719d5ba2a066d4b2261d214199fdcdc09b86d67901ebea5e494b75f16"} Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.757947 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0078059719d5ba2a066d4b2261d214199fdcdc09b86d67901ebea5e494b75f16" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.757735 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.855116 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7"] Oct 01 13:03:21 crc kubenswrapper[4913]: E1001 13:03:21.855519 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="extract-content" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.855534 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="extract-content" Oct 01 13:03:21 crc kubenswrapper[4913]: E1001 13:03:21.855551 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="extract-utilities" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.855559 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="extract-utilities" Oct 01 13:03:21 crc kubenswrapper[4913]: E1001 13:03:21.855574 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="registry-server" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.855582 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="registry-server" Oct 01 13:03:21 crc kubenswrapper[4913]: E1001 13:03:21.855608 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.855616 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:21 crc 
kubenswrapper[4913]: I1001 13:03:21.855836 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0135ef10-a28c-42de-bc9c-bdc0cd20e8e1" containerName="registry-server" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.855862 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.856562 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.859838 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.862610 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.862968 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.865988 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.869622 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7"] Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.959301 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf8wb\" (UniqueName: \"kubernetes.io/projected/e576334d-7925-4127-b1b9-2d613614437f-kube-api-access-rf8wb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.959377 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:21 crc kubenswrapper[4913]: I1001 13:03:21.959419 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.061633 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.061793 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf8wb\" 
(UniqueName: \"kubernetes.io/projected/e576334d-7925-4127-b1b9-2d613614437f-kube-api-access-rf8wb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.061828 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.065663 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.067018 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.086374 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf8wb\" (UniqueName: \"kubernetes.io/projected/e576334d-7925-4127-b1b9-2d613614437f-kube-api-access-rf8wb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.178885 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.756002 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7"] Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.762856 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:03:22 crc kubenswrapper[4913]: I1001 13:03:22.774372 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" event={"ID":"e576334d-7925-4127-b1b9-2d613614437f","Type":"ContainerStarted","Data":"9bc3281a9ced5eb81f3bf2f1fc8187999497f2a0de17a3d8b06ea8ad1a0209b2"} Oct 01 13:03:23 crc kubenswrapper[4913]: I1001 13:03:23.784050 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" event={"ID":"e576334d-7925-4127-b1b9-2d613614437f","Type":"ContainerStarted","Data":"21710f16105cc155ca639d83af3cb105eeaf4904cbe8499254132c8af953e217"} Oct 01 13:03:23 crc kubenswrapper[4913]: I1001 13:03:23.817433 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" podStartSLOduration=2.299446723 podStartE2EDuration="2.817410638s" podCreationTimestamp="2025-10-01 13:03:21 +0000 UTC" firstStartedPulling="2025-10-01 13:03:22.762574925 +0000 UTC m=+1534.666050513" lastFinishedPulling="2025-10-01 13:03:23.28053885 +0000 UTC m=+1535.184014428" observedRunningTime="2025-10-01 13:03:23.810693013 +0000 UTC m=+1535.714168671" watchObservedRunningTime="2025-10-01 13:03:23.817410638 +0000 UTC m=+1535.720886256" Oct 01 13:03:24 crc kubenswrapper[4913]: I1001 13:03:24.051328 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8b7e-account-create-p85hr"] Oct 01 13:03:24 crc kubenswrapper[4913]: I1001 13:03:24.063331 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8b7e-account-create-p85hr"] Oct 01 13:03:24 crc kubenswrapper[4913]: I1001 13:03:24.077972 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-154d-account-create-kmjfl"] Oct 01 13:03:24 crc kubenswrapper[4913]: I1001 13:03:24.093266 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-154d-account-create-kmjfl"] Oct 01 13:03:24 crc kubenswrapper[4913]: I1001 13:03:24.816947 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792a476e-74a0-4af7-be72-f496b501e22f" path="/var/lib/kubelet/pods/792a476e-74a0-4af7-be72-f496b501e22f/volumes" Oct 01 13:03:24 crc kubenswrapper[4913]: I1001 13:03:24.817500 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbc2f4d-f870-4b0e-a484-4feefbf89762" path="/var/lib/kubelet/pods/afbc2f4d-f870-4b0e-a484-4feefbf89762/volumes" Oct 01 13:03:28 crc kubenswrapper[4913]: I1001 13:03:28.831510 4913 generic.go:334] "Generic (PLEG): container finished" podID="e576334d-7925-4127-b1b9-2d613614437f" containerID="21710f16105cc155ca639d83af3cb105eeaf4904cbe8499254132c8af953e217" exitCode=0 Oct 01 13:03:28 crc kubenswrapper[4913]: I1001 13:03:28.831553 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" 
event={"ID":"e576334d-7925-4127-b1b9-2d613614437f","Type":"ContainerDied","Data":"21710f16105cc155ca639d83af3cb105eeaf4904cbe8499254132c8af953e217"} Oct 01 13:03:29 crc kubenswrapper[4913]: I1001 13:03:29.807548 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:03:29 crc kubenswrapper[4913]: E1001 13:03:29.807979 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.033927 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-72e2-account-create-xdl2c"] Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.042511 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-72e2-account-create-xdl2c"] Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.273333 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.313111 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-inventory\") pod \"e576334d-7925-4127-b1b9-2d613614437f\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.313161 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-ssh-key\") pod \"e576334d-7925-4127-b1b9-2d613614437f\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.313194 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf8wb\" (UniqueName: \"kubernetes.io/projected/e576334d-7925-4127-b1b9-2d613614437f-kube-api-access-rf8wb\") pod \"e576334d-7925-4127-b1b9-2d613614437f\" (UID: \"e576334d-7925-4127-b1b9-2d613614437f\") " Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.318996 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e576334d-7925-4127-b1b9-2d613614437f-kube-api-access-rf8wb" (OuterVolumeSpecName: "kube-api-access-rf8wb") pod "e576334d-7925-4127-b1b9-2d613614437f" (UID: "e576334d-7925-4127-b1b9-2d613614437f"). InnerVolumeSpecName "kube-api-access-rf8wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.345566 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e576334d-7925-4127-b1b9-2d613614437f" (UID: "e576334d-7925-4127-b1b9-2d613614437f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.346050 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-inventory" (OuterVolumeSpecName: "inventory") pod "e576334d-7925-4127-b1b9-2d613614437f" (UID: "e576334d-7925-4127-b1b9-2d613614437f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.414810 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.414849 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e576334d-7925-4127-b1b9-2d613614437f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.414862 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf8wb\" (UniqueName: \"kubernetes.io/projected/e576334d-7925-4127-b1b9-2d613614437f-kube-api-access-rf8wb\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.815962 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b36fbf-6076-4ebf-8d71-6f3f121a9f5f" path="/var/lib/kubelet/pods/08b36fbf-6076-4ebf-8d71-6f3f121a9f5f/volumes" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.849938 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" event={"ID":"e576334d-7925-4127-b1b9-2d613614437f","Type":"ContainerDied","Data":"9bc3281a9ced5eb81f3bf2f1fc8187999497f2a0de17a3d8b06ea8ad1a0209b2"} Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.849974 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc3281a9ced5eb81f3bf2f1fc8187999497f2a0de17a3d8b06ea8ad1a0209b2" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.850029 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.913864 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26"] Oct 01 13:03:30 crc kubenswrapper[4913]: E1001 13:03:30.914319 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e576334d-7925-4127-b1b9-2d613614437f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.914346 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e576334d-7925-4127-b1b9-2d613614437f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.914626 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e576334d-7925-4127-b1b9-2d613614437f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.915374 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.918654 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.918682 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.919180 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.919233 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:03:30 crc kubenswrapper[4913]: I1001 13:03:30.923031 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26"] Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.025503 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx49w\" (UniqueName: \"kubernetes.io/projected/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-kube-api-access-gx49w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.025611 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.025696 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.026648 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-l4t7s"] Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.035823 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h8szt"] Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.043176 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-l4t7s"] Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.050857 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h8szt"] Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.127581 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.127664 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.127744 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx49w\" (UniqueName: \"kubernetes.io/projected/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-kube-api-access-gx49w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.131634 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.137909 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.170932 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx49w\" (UniqueName: \"kubernetes.io/projected/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-kube-api-access-gx49w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-d2n26\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.229284 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.759165 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26"] Oct 01 13:03:31 crc kubenswrapper[4913]: I1001 13:03:31.859283 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" event={"ID":"38ea86ed-6acc-47b0-b8cf-92a7f2c44428","Type":"ContainerStarted","Data":"2c411265d338921898f8c442f032ee9178260890795a22c61108bc114301e4f3"} Oct 01 13:03:32 crc kubenswrapper[4913]: I1001 13:03:32.817921 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031164fb-9e5d-42a4-aca9-45ce70c435d7" path="/var/lib/kubelet/pods/031164fb-9e5d-42a4-aca9-45ce70c435d7/volumes" Oct 01 13:03:32 crc kubenswrapper[4913]: I1001 13:03:32.819020 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d" path="/var/lib/kubelet/pods/0a55cd8b-17c4-480f-85f8-2dd8bd1d5e9d/volumes" Oct 01 13:03:32 crc kubenswrapper[4913]: I1001 13:03:32.870400 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" event={"ID":"38ea86ed-6acc-47b0-b8cf-92a7f2c44428","Type":"ContainerStarted","Data":"b24d119dd29c4c04a767344d0a98df004efe8b045040a98d6bc94f69f5ca4f44"} Oct 01 13:03:32 crc kubenswrapper[4913]: I1001 13:03:32.890399 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" podStartSLOduration=2.2392245920000002 podStartE2EDuration="2.890384294s" podCreationTimestamp="2025-10-01 13:03:30 +0000 UTC" firstStartedPulling="2025-10-01 13:03:31.756105431 +0000 UTC m=+1543.659581029" lastFinishedPulling="2025-10-01 13:03:32.407265153 +0000 UTC m=+1544.310740731" observedRunningTime="2025-10-01 13:03:32.887587876 +0000 UTC m=+1544.791063464" watchObservedRunningTime="2025-10-01 13:03:32.890384294 +0000 UTC m=+1544.793859872" Oct 01 13:03:35 crc kubenswrapper[4913]: I1001 13:03:35.042396 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-482r5"] Oct 01 13:03:35 crc kubenswrapper[4913]: I1001 13:03:35.058725 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-482r5"] Oct 01 13:03:36 crc kubenswrapper[4913]: I1001 13:03:36.824101 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736d359f-862f-434b-855d-4e7152a297a3" path="/var/lib/kubelet/pods/736d359f-862f-434b-855d-4e7152a297a3/volumes" Oct 01 13:03:39 crc kubenswrapper[4913]: I1001 13:03:39.032221 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qslcs"] Oct 01 13:03:39 crc kubenswrapper[4913]: I1001 13:03:39.038705 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qslcs"] Oct 01 13:03:40 crc kubenswrapper[4913]: I1001 13:03:40.808028 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:03:40 crc kubenswrapper[4913]: E1001 13:03:40.808700 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 01 13:03:40 crc kubenswrapper[4913]: I1001 13:03:40.824352 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d726dfa-1763-4ae9-999a-2b58c91ae988" path="/var/lib/kubelet/pods/0d726dfa-1763-4ae9-999a-2b58c91ae988/volumes"
Oct 01 13:03:42 crc kubenswrapper[4913]: I1001 13:03:42.033427 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2409-account-create-8xltx"]
Oct 01 13:03:42 crc kubenswrapper[4913]: I1001 13:03:42.047619 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5ae0-account-create-rvj2c"]
Oct 01 13:03:42 crc kubenswrapper[4913]: I1001 13:03:42.057152 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2409-account-create-8xltx"]
Oct 01 13:03:42 crc kubenswrapper[4913]: I1001 13:03:42.066067 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5ae0-account-create-rvj2c"]
Oct 01 13:03:42 crc kubenswrapper[4913]: I1001 13:03:42.818763 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31c09d0-abac-44cc-9d6d-94c05a99e577" path="/var/lib/kubelet/pods/a31c09d0-abac-44cc-9d6d-94c05a99e577/volumes"
Oct 01 13:03:42 crc kubenswrapper[4913]: I1001 13:03:42.820180 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f222cfe0-8537-43b8-a6be-b18bd7bbcaff" path="/var/lib/kubelet/pods/f222cfe0-8537-43b8-a6be-b18bd7bbcaff/volumes"
Oct 01 13:03:54 crc kubenswrapper[4913]: I1001 13:03:54.806787 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7"
Oct 01 13:03:54 crc kubenswrapper[4913]: E1001 13:03:54.807677 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.334208 4913 scope.go:117] "RemoveContainer" containerID="c2de77f0dde39eb1b18feac456663932fda6e8cfeff231d0c283c45a50884608"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.384929 4913 scope.go:117] "RemoveContainer" containerID="12b419c58d516e7add950651ec72776dcc9cb68fe99158a4f5f643d6dcd8bfba"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.415997 4913 scope.go:117] "RemoveContainer" containerID="9a21abe3da8a5b9eeca15c45c28c09716e7d2662ab769ebdfb3c32ea7e246b3a"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.459997 4913 scope.go:117] "RemoveContainer" containerID="fa1367da838eb1d9d7ad7f0f3c984cd02576420e126bf863a1071c8477ad0550"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.500151 4913 scope.go:117] "RemoveContainer" containerID="618aad5f2cb8064047d1555435cf75f5b94e2d99a72c38de318bd12ea5565563"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.541962 4913 scope.go:117] "RemoveContainer" containerID="bade0a4dacc7ccca93f454ba26c19c88d79bbf33590977279aa5ac8cb8a578d2"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.584856 4913 scope.go:117] "RemoveContainer" containerID="c587b6ca356b4a0324937018127b81faa7d65a8fc75955af7a271868a5467ebd"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.609236 4913 scope.go:117] "RemoveContainer" containerID="ea0ce763dd128e8cab6b4cf28b697295ede7e0ba785bf094156e1dc64b216484"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.672537 4913 scope.go:117] "RemoveContainer" containerID="7772479e635746bd28e22825db2711638f4f5e1dde8a95d8437381ed62339858"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.692243 4913 scope.go:117] "RemoveContainer" containerID="afc7ec16485816182a90a00e377ec239fb4a1a1c7c8e35bdc014cc43119ce7a0"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.727653 4913 scope.go:117] "RemoveContainer" containerID="09ff27b0fdd6406176781e588fa5b1df77493dc22e94bd0198e7914f25b45a69"
Oct 01 13:04:04 crc kubenswrapper[4913]: I1001 13:04:04.750480 4913 scope.go:117] "RemoveContainer" containerID="418eea09dd911901ce85e19b83e523841077d62e27eb30f64fa56d6d516abbd0"
Oct 01 13:04:05 crc kubenswrapper[4913]: I1001 13:04:05.026445 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f41e-account-create-5hkwk"]
Oct 01 13:04:05 crc kubenswrapper[4913]: I1001 13:04:05.034885 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f41e-account-create-5hkwk"]
Oct 01 13:04:06 crc kubenswrapper[4913]: I1001 13:04:06.029306 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8r5fv"]
Oct 01 13:04:06 crc kubenswrapper[4913]: I1001 13:04:06.037583 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8r5fv"]
Oct 01 13:04:06 crc kubenswrapper[4913]: I1001 13:04:06.818396 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3523af18-f212-4eeb-8e62-5fabc32a4e6c" path="/var/lib/kubelet/pods/3523af18-f212-4eeb-8e62-5fabc32a4e6c/volumes"
Oct 01 13:04:06 crc kubenswrapper[4913]: I1001 13:04:06.819860 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3950662-6b64-4585-8cb2-8c94623a3d66" path="/var/lib/kubelet/pods/b3950662-6b64-4585-8cb2-8c94623a3d66/volumes"
Oct 01 13:04:07 crc kubenswrapper[4913]: I1001 13:04:07.807247 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7"
Oct 01 13:04:07 crc kubenswrapper[4913]: E1001 13:04:07.807995 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:04:08 crc kubenswrapper[4913]: I1001 13:04:08.214702 4913 generic.go:334] "Generic (PLEG): container finished" podID="38ea86ed-6acc-47b0-b8cf-92a7f2c44428" containerID="b24d119dd29c4c04a767344d0a98df004efe8b045040a98d6bc94f69f5ca4f44" exitCode=0
Oct 01 13:04:08 crc kubenswrapper[4913]: I1001 13:04:08.214784 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" event={"ID":"38ea86ed-6acc-47b0-b8cf-92a7f2c44428","Type":"ContainerDied","Data":"b24d119dd29c4c04a767344d0a98df004efe8b045040a98d6bc94f69f5ca4f44"}
Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.620312 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26"
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.730443 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-inventory\") pod \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.730509 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx49w\" (UniqueName: \"kubernetes.io/projected/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-kube-api-access-gx49w\") pod \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.731033 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-ssh-key\") pod \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\" (UID: \"38ea86ed-6acc-47b0-b8cf-92a7f2c44428\") " Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.737016 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-kube-api-access-gx49w" (OuterVolumeSpecName: "kube-api-access-gx49w") pod "38ea86ed-6acc-47b0-b8cf-92a7f2c44428" (UID: "38ea86ed-6acc-47b0-b8cf-92a7f2c44428"). InnerVolumeSpecName "kube-api-access-gx49w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.760452 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-inventory" (OuterVolumeSpecName: "inventory") pod "38ea86ed-6acc-47b0-b8cf-92a7f2c44428" (UID: "38ea86ed-6acc-47b0-b8cf-92a7f2c44428"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.768054 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "38ea86ed-6acc-47b0-b8cf-92a7f2c44428" (UID: "38ea86ed-6acc-47b0-b8cf-92a7f2c44428"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.836146 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.836176 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx49w\" (UniqueName: \"kubernetes.io/projected/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-kube-api-access-gx49w\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:09 crc kubenswrapper[4913]: I1001 13:04:09.836187 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38ea86ed-6acc-47b0-b8cf-92a7f2c44428-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.232884 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" event={"ID":"38ea86ed-6acc-47b0-b8cf-92a7f2c44428","Type":"ContainerDied","Data":"2c411265d338921898f8c442f032ee9178260890795a22c61108bc114301e4f3"} Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.232934 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c411265d338921898f8c442f032ee9178260890795a22c61108bc114301e4f3" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.232992 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.312739 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx"] Oct 01 13:04:10 crc kubenswrapper[4913]: E1001 13:04:10.313173 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ea86ed-6acc-47b0-b8cf-92a7f2c44428" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.313196 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ea86ed-6acc-47b0-b8cf-92a7f2c44428" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.313478 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ea86ed-6acc-47b0-b8cf-92a7f2c44428" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.314234 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.316991 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.318118 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.318137 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.318177 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.328910 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx"] Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.445107 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.445229 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.445339 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbrm\" (UniqueName: \"kubernetes.io/projected/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-kube-api-access-wsbrm\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.546801 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.546976 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.547069 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbrm\" (UniqueName: \"kubernetes.io/projected/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-kube-api-access-wsbrm\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" 
(UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.550833 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.551149 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.572663 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbrm\" (UniqueName: \"kubernetes.io/projected/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-kube-api-access-wsbrm\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:10 crc kubenswrapper[4913]: I1001 13:04:10.643302 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:11 crc kubenswrapper[4913]: I1001 13:04:11.201904 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx"] Oct 01 13:04:11 crc kubenswrapper[4913]: I1001 13:04:11.242439 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" event={"ID":"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591","Type":"ContainerStarted","Data":"29d49838f441edc721dc45c43e5cd6f166efc4bc73325ed6e928120d1cb3e9bb"} Oct 01 13:04:13 crc kubenswrapper[4913]: I1001 13:04:13.271122 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" event={"ID":"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591","Type":"ContainerStarted","Data":"dc058e8d0782c134e52ac03f6d2e26fe129be7c52950ba4a1bd8d5f0bdba7661"} Oct 01 13:04:17 crc kubenswrapper[4913]: I1001 13:04:17.321303 4913 generic.go:334] "Generic (PLEG): container finished" podID="4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" containerID="dc058e8d0782c134e52ac03f6d2e26fe129be7c52950ba4a1bd8d5f0bdba7661" exitCode=0 Oct 01 13:04:17 crc kubenswrapper[4913]: I1001 13:04:17.321393 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" event={"ID":"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591","Type":"ContainerDied","Data":"dc058e8d0782c134e52ac03f6d2e26fe129be7c52950ba4a1bd8d5f0bdba7661"} Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.782177 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.934854 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-inventory\") pod \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.935091 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsbrm\" (UniqueName: \"kubernetes.io/projected/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-kube-api-access-wsbrm\") pod \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.935139 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-ssh-key\") pod \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\" (UID: \"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591\") " Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.939947 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-kube-api-access-wsbrm" (OuterVolumeSpecName: "kube-api-access-wsbrm") pod "4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" (UID: "4dd227f3-5b5c-4154-9b86-e0a2c2bfe591"). InnerVolumeSpecName "kube-api-access-wsbrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.963297 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-inventory" (OuterVolumeSpecName: "inventory") pod "4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" (UID: "4dd227f3-5b5c-4154-9b86-e0a2c2bfe591"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:04:18 crc kubenswrapper[4913]: I1001 13:04:18.964029 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" (UID: "4dd227f3-5b5c-4154-9b86-e0a2c2bfe591"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.037406 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsbrm\" (UniqueName: \"kubernetes.io/projected/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-kube-api-access-wsbrm\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.037437 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.037448 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.045726 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kwlhs"] Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.057818 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kwlhs"] Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.353280 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" event={"ID":"4dd227f3-5b5c-4154-9b86-e0a2c2bfe591","Type":"ContainerDied","Data":"29d49838f441edc721dc45c43e5cd6f166efc4bc73325ed6e928120d1cb3e9bb"} Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.353629 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d49838f441edc721dc45c43e5cd6f166efc4bc73325ed6e928120d1cb3e9bb" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.353311 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.428839 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk"] Oct 01 13:04:19 crc kubenswrapper[4913]: E1001 13:04:19.429162 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.429179 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.429405 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.429933 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.432521 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.432667 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.433891 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.438304 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.440600 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk"] Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.549108 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcnvw\" (UniqueName: \"kubernetes.io/projected/fb05f7e6-1dd6-4893-8394-8e88e69f141f-kube-api-access-wcnvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.549370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.549442 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.650878 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.651002 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.651085 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcnvw\" (UniqueName: \"kubernetes.io/projected/fb05f7e6-1dd6-4893-8394-8e88e69f141f-kube-api-access-wcnvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" 
(UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.658995 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.659576 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.672159 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcnvw\" (UniqueName: \"kubernetes.io/projected/fb05f7e6-1dd6-4893-8394-8e88e69f141f-kube-api-access-wcnvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.749161 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:04:19 crc kubenswrapper[4913]: I1001 13:04:19.807422 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:04:19 crc kubenswrapper[4913]: E1001 13:04:19.808333 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:04:20 crc kubenswrapper[4913]: I1001 13:04:20.348923 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk"] Oct 01 13:04:20 crc kubenswrapper[4913]: I1001 13:04:20.364940 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" event={"ID":"fb05f7e6-1dd6-4893-8394-8e88e69f141f","Type":"ContainerStarted","Data":"5536a89e7699cad8cb1f2eb30b3edd5024e06469c7a88c2436ec13bd0573c6b4"} Oct 01 13:04:20 crc kubenswrapper[4913]: I1001 13:04:20.818197 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872e2d84-7827-401f-bf95-60df7954e22e" path="/var/lib/kubelet/pods/872e2d84-7827-401f-bf95-60df7954e22e/volumes" Oct 01 13:04:21 crc kubenswrapper[4913]: I1001 13:04:21.380919 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" event={"ID":"fb05f7e6-1dd6-4893-8394-8e88e69f141f","Type":"ContainerStarted","Data":"ddca2dee667cc931bba0e873e748681b1ad4ff16f4afe97d82d6d2ba2d6afab1"} Oct 01 13:04:21 crc kubenswrapper[4913]: I1001 13:04:21.411468 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" podStartSLOduration=1.948895398 podStartE2EDuration="2.411442688s" podCreationTimestamp="2025-10-01 13:04:19 +0000 UTC" firstStartedPulling="2025-10-01 13:04:20.34930215 +0000 UTC m=+1592.252777738" lastFinishedPulling="2025-10-01 13:04:20.81184945 +0000 UTC m=+1592.715325028" observedRunningTime="2025-10-01 13:04:21.401677179 +0000 UTC m=+1593.305152777" watchObservedRunningTime="2025-10-01 13:04:21.411442688 +0000 UTC m=+1593.314918276" Oct 01 13:04:32 crc kubenswrapper[4913]: I1001 13:04:32.806960 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:04:32 crc kubenswrapper[4913]: E1001 13:04:32.807955 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:04:43 crc kubenswrapper[4913]: I1001 13:04:43.806850 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:04:43 crc kubenswrapper[4913]: E1001 13:04:43.808018 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:04:58 crc kubenswrapper[4913]: I1001 13:04:58.813534 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:04:58 crc kubenswrapper[4913]: E1001 13:04:58.814322 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:05:03 crc kubenswrapper[4913]: I1001 13:05:03.051839 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-x9gmn"] Oct 01 13:05:03 crc kubenswrapper[4913]: I1001 13:05:03.065794 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-x9gmn"] Oct 01 13:05:03 crc kubenswrapper[4913]: I1001 13:05:03.073181 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qzddd"] Oct 01 13:05:03 crc kubenswrapper[4913]: I1001 13:05:03.079975 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qzddd"] Oct 01 13:05:04 crc kubenswrapper[4913]: I1001 13:05:04.826196 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f1ef1c4-7a72-4569-b21c-ef13cb766d25" path="/var/lib/kubelet/pods/0f1ef1c4-7a72-4569-b21c-ef13cb766d25/volumes" Oct 01 13:05:04 crc kubenswrapper[4913]: I1001 13:05:04.828146 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="706c5fb0-a691-4f92-bb4e-a6ba720abfa1" path="/var/lib/kubelet/pods/706c5fb0-a691-4f92-bb4e-a6ba720abfa1/volumes" Oct 01 13:05:04 crc kubenswrapper[4913]: I1001 13:05:04.971027 4913 scope.go:117] "RemoveContainer" containerID="c744d3854adb67bd38db6954675af051c3c83181adcd76ae4cf902629db30151" Oct 01 13:05:04 crc kubenswrapper[4913]: I1001 13:05:04.996536 4913 scope.go:117] "RemoveContainer" containerID="abe58ab5ed439f472f92760a23b80bd669796d967e771e2cecc0636a0a3b62a8" Oct 01 13:05:05 crc kubenswrapper[4913]: I1001 13:05:05.078210 4913 scope.go:117] "RemoveContainer" containerID="0733b180aa15fc2fdb00801899e46a4bc69a580d010f4ca862fef2e338a3efbd" Oct 01 13:05:05 crc kubenswrapper[4913]: I1001 13:05:05.130196 4913 scope.go:117] "RemoveContainer" containerID="da08a47649e49b8fb350534810eca2b16320ed6ac6621b509d9fc5bee0170851" Oct 01 13:05:05 crc kubenswrapper[4913]: I1001 13:05:05.171230 4913 scope.go:117] "RemoveContainer" containerID="56600f195a007cdc232b8005819f8754c4b696d5bdab718fbac42ed771d8ffe2" Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.039889 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-g8j5s"] Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.054198 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bf4s9"] Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.074136 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hkhhg"] Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.086557 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-g8j5s"] Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.099224 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bf4s9"] Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.109958 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hkhhg"] Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.827804 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa06e3e-e04d-481f-87e0-a55d168994f7" path="/var/lib/kubelet/pods/3fa06e3e-e04d-481f-87e0-a55d168994f7/volumes" Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.829565 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4983dc8d-2950-45be-9bd3-33f5e24d52ef" path="/var/lib/kubelet/pods/4983dc8d-2950-45be-9bd3-33f5e24d52ef/volumes" Oct 01 13:05:06 crc kubenswrapper[4913]: I1001 13:05:06.831508 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82163264-a7e8-4183-b162-8ddabbce7f39" path="/var/lib/kubelet/pods/82163264-a7e8-4183-b162-8ddabbce7f39/volumes" Oct 01 13:05:10 crc kubenswrapper[4913]: I1001 13:05:10.034397 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b6l67"] Oct 01 13:05:10 crc kubenswrapper[4913]: I1001 13:05:10.041017 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b6l67"] Oct 01 13:05:10 crc kubenswrapper[4913]: I1001 13:05:10.830575 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57296118-560c-4764-b94a-472d8467f7c0" path="/var/lib/kubelet/pods/57296118-560c-4764-b94a-472d8467f7c0/volumes" Oct 01 13:05:13 crc kubenswrapper[4913]: I1001 13:05:13.807189 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:05:13 crc kubenswrapper[4913]: E1001 13:05:13.808014 4913 
Oct 01 13:05:16 crc kubenswrapper[4913]: I1001 13:05:16.035799 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-70c3-account-create-9wk5c"]
Oct 01 13:05:16 crc kubenswrapper[4913]: I1001 13:05:16.044187 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-70c3-account-create-9wk5c"]
Oct 01 13:05:16 crc kubenswrapper[4913]: I1001 13:05:16.051051 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-55b9-account-create-v4tbp"]
Oct 01 13:05:16 crc kubenswrapper[4913]: I1001 13:05:16.057436 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-55b9-account-create-v4tbp"]
Oct 01 13:05:16 crc kubenswrapper[4913]: I1001 13:05:16.819302 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e6e788-d705-40e2-b19e-23d915ccc7cd" path="/var/lib/kubelet/pods/c2e6e788-d705-40e2-b19e-23d915ccc7cd/volumes"
Oct 01 13:05:16 crc kubenswrapper[4913]: I1001 13:05:16.820049 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd92064d-f7ca-4b7c-8596-ccc759c048ad" path="/var/lib/kubelet/pods/fd92064d-f7ca-4b7c-8596-ccc759c048ad/volumes"
Oct 01 13:05:17 crc kubenswrapper[4913]: I1001 13:05:17.041498 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-szknr"]
Oct 01 13:05:17 crc kubenswrapper[4913]: I1001 13:05:17.058716 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-szknr"]
Oct 01 13:05:17 crc kubenswrapper[4913]: I1001 13:05:17.968467 4913 generic.go:334] "Generic (PLEG): container finished" podID="fb05f7e6-1dd6-4893-8394-8e88e69f141f" containerID="ddca2dee667cc931bba0e873e748681b1ad4ff16f4afe97d82d6d2ba2d6afab1" exitCode=2
Oct 01 13:05:17 crc kubenswrapper[4913]: I1001 13:05:17.968819 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" event={"ID":"fb05f7e6-1dd6-4893-8394-8e88e69f141f","Type":"ContainerDied","Data":"ddca2dee667cc931bba0e873e748681b1ad4ff16f4afe97d82d6d2ba2d6afab1"}
Oct 01 13:05:18 crc kubenswrapper[4913]: I1001 13:05:18.821569 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b038340-cef3-419a-a1e2-2aa46a7f3ee6" path="/var/lib/kubelet/pods/3b038340-cef3-419a-a1e2-2aa46a7f3ee6/volumes"
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.425457 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk"
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.558449 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcnvw\" (UniqueName: \"kubernetes.io/projected/fb05f7e6-1dd6-4893-8394-8e88e69f141f-kube-api-access-wcnvw\") pod \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") "
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.558585 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-inventory\") pod \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") "
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.558664 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-ssh-key\") pod \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\" (UID: \"fb05f7e6-1dd6-4893-8394-8e88e69f141f\") "
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.565953 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb05f7e6-1dd6-4893-8394-8e88e69f141f-kube-api-access-wcnvw" (OuterVolumeSpecName: "kube-api-access-wcnvw") pod "fb05f7e6-1dd6-4893-8394-8e88e69f141f" (UID: "fb05f7e6-1dd6-4893-8394-8e88e69f141f"). InnerVolumeSpecName "kube-api-access-wcnvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.599128 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-inventory" (OuterVolumeSpecName: "inventory") pod "fb05f7e6-1dd6-4893-8394-8e88e69f141f" (UID: "fb05f7e6-1dd6-4893-8394-8e88e69f141f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.605521 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fb05f7e6-1dd6-4893-8394-8e88e69f141f" (UID: "fb05f7e6-1dd6-4893-8394-8e88e69f141f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
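Unlike the earlier steps, configure-os finishes with exitCode=2 at 13:05:17, the conventional ansible-playbook status for failed tasks; the pod is then torn down, and a few records below the operator creates a replacement (configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7) to retry the step. A hypothetical mapping of exit codes to a retry decision, purely illustrative (the real retry logic lives in the openstack-operator, not in kubelet):

    package main

    import "fmt"

    // classify is a hypothetical interpretation of the ansible-runner
    // exit codes seen in the PLEG "container finished" records.
    func classify(exitCode int) string {
        switch exitCode {
        case 0:
            return "step succeeded; create the next deployment pod"
        case 2:
            return "tasks failed; recreate the pod to retry the step"
        default:
            return "unexpected failure; surface the error on the CR"
        }
    }

    func main() {
        fmt.Println(2, "->", classify(2)) // the configure-os case above
    }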
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.661131 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcnvw\" (UniqueName: \"kubernetes.io/projected/fb05f7e6-1dd6-4893-8394-8e88e69f141f-kube-api-access-wcnvw\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.661192 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.661206 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb05f7e6-1dd6-4893-8394-8e88e69f141f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.991614 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" event={"ID":"fb05f7e6-1dd6-4893-8394-8e88e69f141f","Type":"ContainerDied","Data":"5536a89e7699cad8cb1f2eb30b3edd5024e06469c7a88c2436ec13bd0573c6b4"} Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.991667 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5536a89e7699cad8cb1f2eb30b3edd5024e06469c7a88c2436ec13bd0573c6b4" Oct 01 13:05:19 crc kubenswrapper[4913]: I1001 13:05:19.991735 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk" Oct 01 13:05:25 crc kubenswrapper[4913]: I1001 13:05:25.025826 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c0ec-account-create-tz4m8"] Oct 01 13:05:25 crc kubenswrapper[4913]: I1001 13:05:25.036352 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c0ec-account-create-tz4m8"] Oct 01 13:05:25 crc kubenswrapper[4913]: I1001 13:05:25.807145 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:05:25 crc kubenswrapper[4913]: E1001 13:05:25.807617 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:05:26 crc kubenswrapper[4913]: I1001 13:05:26.818993 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639c9d26-ffcf-4cd4-989c-de3f777ec5ea" path="/var/lib/kubelet/pods/639c9d26-ffcf-4cd4-989c-de3f777ec5ea/volumes" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.036374 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7"] Oct 01 13:05:27 crc kubenswrapper[4913]: E1001 13:05:27.037319 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb05f7e6-1dd6-4893-8394-8e88e69f141f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.037711 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb05f7e6-1dd6-4893-8394-8e88e69f141f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:27 crc 
kubenswrapper[4913]: I1001 13:05:27.038478 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb05f7e6-1dd6-4893-8394-8e88e69f141f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.039815 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.043057 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.044743 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.046481 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.047897 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7"] Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.049570 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.112061 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57llg\" (UniqueName: \"kubernetes.io/projected/a50a246d-1d3c-4764-aab3-3615be090051-kube-api-access-57llg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.112253 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.112362 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.213743 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57llg\" (UniqueName: \"kubernetes.io/projected/a50a246d-1d3c-4764-aab3-3615be090051-kube-api-access-57llg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.213837 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.213873 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.223245 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.224389 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.233688 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57llg\" (UniqueName: \"kubernetes.io/projected/a50a246d-1d3c-4764-aab3-3615be090051-kube-api-access-57llg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.369608 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:05:27 crc kubenswrapper[4913]: I1001 13:05:27.926771 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7"] Oct 01 13:05:28 crc kubenswrapper[4913]: I1001 13:05:28.079006 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" event={"ID":"a50a246d-1d3c-4764-aab3-3615be090051","Type":"ContainerStarted","Data":"c8c5224bffa7dad4184193c74015614295a6af92450f4ae6977047ca31444c08"} Oct 01 13:05:30 crc kubenswrapper[4913]: I1001 13:05:30.108957 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" event={"ID":"a50a246d-1d3c-4764-aab3-3615be090051","Type":"ContainerStarted","Data":"5c32704683e7da980f7b5fbab4a2da43715e012432346c1b88443a5f1bce7c2f"} Oct 01 13:05:30 crc kubenswrapper[4913]: I1001 13:05:30.149361 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" podStartSLOduration=2.253631344 podStartE2EDuration="3.149332274s" podCreationTimestamp="2025-10-01 13:05:27 +0000 UTC" firstStartedPulling="2025-10-01 13:05:27.946895353 +0000 UTC m=+1659.850370951" lastFinishedPulling="2025-10-01 13:05:28.842596273 +0000 UTC m=+1660.746071881" observedRunningTime="2025-10-01 13:05:30.131638456 +0000 UTC m=+1662.035114034" watchObservedRunningTime="2025-10-01 13:05:30.149332274 +0000 UTC m=+1662.052807882" Oct 01 13:05:37 crc kubenswrapper[4913]: I1001 13:05:37.807239 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:05:37 crc kubenswrapper[4913]: E1001 13:05:37.808206 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:05:44 crc kubenswrapper[4913]: I1001 13:05:44.066512 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t5ghk"] Oct 01 13:05:44 crc kubenswrapper[4913]: I1001 13:05:44.079705 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t5ghk"] Oct 01 13:05:44 crc kubenswrapper[4913]: I1001 13:05:44.825569 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0e4f52-1a20-4443-98c4-03620eec847f" path="/var/lib/kubelet/pods/ba0e4f52-1a20-4443-98c4-03620eec847f/volumes" Oct 01 13:05:49 crc kubenswrapper[4913]: I1001 13:05:49.806823 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:05:49 crc kubenswrapper[4913]: E1001 13:05:49.807659 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" 
Oct 01 13:06:04 crc kubenswrapper[4913]: I1001 13:06:04.042114 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8gshc"] Oct 01 13:06:04 crc kubenswrapper[4913]: I1001 13:06:04.056816 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8gshc"] Oct 01 13:06:04 crc kubenswrapper[4913]: I1001 13:06:04.806939 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:06:04 crc kubenswrapper[4913]: E1001 13:06:04.807652 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:06:04 crc kubenswrapper[4913]: I1001 13:06:04.822931 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c41052-7191-462a-9627-9a2fbe9206b3" path="/var/lib/kubelet/pods/36c41052-7191-462a-9627-9a2fbe9206b3/volumes" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.328436 4913 scope.go:117] "RemoveContainer" containerID="ba6de2ffb7c203f7458dff486130e95a87d07f1222801f2e843cb513a1548243" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.350640 4913 scope.go:117] "RemoveContainer" containerID="35811603ce86b87b16e3fd29bfca2c735f17a20dbdeab6d793181e40bcb4ef63" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.393726 4913 scope.go:117] "RemoveContainer" containerID="ff54338cd5d7745e0ab1b98057764d1aec54032102b8ec54ceb2241b8c876f42" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.430390 4913 scope.go:117] "RemoveContainer" containerID="c868668574f49e75abdcb7f6f977a2e4d3d3d10bfbc90936e59801d010e6d903" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.496176 4913 scope.go:117] "RemoveContainer" containerID="72b17a12133d5f02c9052daa557d3aaf68bbdf8173d07b3ace9edd03f049c752" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.545638 4913 scope.go:117] "RemoveContainer" containerID="0bfeade2f85027bd1c26bb56a1d886b3f1f9639a4cead38dcefcbd63b3cfd1b8" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.572175 4913 scope.go:117] "RemoveContainer" containerID="6e2d93f6b1e95834d602362606c94652fbad38050142bcb478336afa63092c4e" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.606105 4913 scope.go:117] "RemoveContainer" containerID="527e1694afd53727f343962c8ca2f8b91a5e17ef446bce052be1a32fa1bc7524" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.647167 4913 scope.go:117] "RemoveContainer" containerID="e35a032a728caed1c5f4271a94daf7c620a76e613f314cf9036f7111f731420f" Oct 01 13:06:05 crc kubenswrapper[4913]: I1001 13:06:05.667715 4913 scope.go:117] "RemoveContainer" containerID="327d699d3496bbcf6a5582b1e50a0ced444acf8fbf87978a9f96445422ea93b3" Oct 01 13:06:16 crc kubenswrapper[4913]: I1001 13:06:16.046457 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qz2dw"] Oct 01 13:06:16 crc kubenswrapper[4913]: I1001 13:06:16.055321 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qz2dw"] Oct 01 13:06:16 crc kubenswrapper[4913]: I1001 13:06:16.833004 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d06000dd-9a73-4695-a477-0f361c61cf57" path="/var/lib/kubelet/pods/d06000dd-9a73-4695-a477-0f361c61cf57/volumes" Oct 01 13:06:17 crc kubenswrapper[4913]: I1001 13:06:17.806879 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:06:17 crc kubenswrapper[4913]: E1001 13:06:17.807114 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:06:18 crc kubenswrapper[4913]: I1001 13:06:18.628502 4913 generic.go:334] "Generic (PLEG): container finished" podID="a50a246d-1d3c-4764-aab3-3615be090051" containerID="5c32704683e7da980f7b5fbab4a2da43715e012432346c1b88443a5f1bce7c2f" exitCode=0 Oct 01 13:06:18 crc kubenswrapper[4913]: I1001 13:06:18.628594 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" event={"ID":"a50a246d-1d3c-4764-aab3-3615be090051","Type":"ContainerDied","Data":"5c32704683e7da980f7b5fbab4a2da43715e012432346c1b88443a5f1bce7c2f"} Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.043025 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.177411 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-inventory\") pod \"a50a246d-1d3c-4764-aab3-3615be090051\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.177608 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-ssh-key\") pod \"a50a246d-1d3c-4764-aab3-3615be090051\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.177883 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57llg\" (UniqueName: \"kubernetes.io/projected/a50a246d-1d3c-4764-aab3-3615be090051-kube-api-access-57llg\") pod \"a50a246d-1d3c-4764-aab3-3615be090051\" (UID: \"a50a246d-1d3c-4764-aab3-3615be090051\") " Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.188605 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a50a246d-1d3c-4764-aab3-3615be090051-kube-api-access-57llg" (OuterVolumeSpecName: "kube-api-access-57llg") pod "a50a246d-1d3c-4764-aab3-3615be090051" (UID: "a50a246d-1d3c-4764-aab3-3615be090051"). InnerVolumeSpecName "kube-api-access-57llg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.203579 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a50a246d-1d3c-4764-aab3-3615be090051" (UID: "a50a246d-1d3c-4764-aab3-3615be090051"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.225610 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-inventory" (OuterVolumeSpecName: "inventory") pod "a50a246d-1d3c-4764-aab3-3615be090051" (UID: "a50a246d-1d3c-4764-aab3-3615be090051"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.281237 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.281635 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a50a246d-1d3c-4764-aab3-3615be090051-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.281646 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57llg\" (UniqueName: \"kubernetes.io/projected/a50a246d-1d3c-4764-aab3-3615be090051-kube-api-access-57llg\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.670524 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" event={"ID":"a50a246d-1d3c-4764-aab3-3615be090051","Type":"ContainerDied","Data":"c8c5224bffa7dad4184193c74015614295a6af92450f4ae6977047ca31444c08"} Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.670585 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8c5224bffa7dad4184193c74015614295a6af92450f4ae6977047ca31444c08" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.670662 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.742981 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-56g4q"] Oct 01 13:06:20 crc kubenswrapper[4913]: E1001 13:06:20.743669 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50a246d-1d3c-4764-aab3-3615be090051" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.743690 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50a246d-1d3c-4764-aab3-3615be090051" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.743951 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50a246d-1d3c-4764-aab3-3615be090051" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.744870 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.746763 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.747458 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.747479 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.747531 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.764415 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-56g4q"] Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.894377 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.894508 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.894730 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzqz5\" (UniqueName: \"kubernetes.io/projected/bed26054-0c81-4aa3-93e9-8ad2f2200f86-kube-api-access-rzqz5\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.996948 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.997004 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:20 crc kubenswrapper[4913]: I1001 13:06:20.997076 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzqz5\" (UniqueName: \"kubernetes.io/projected/bed26054-0c81-4aa3-93e9-8ad2f2200f86-kube-api-access-rzqz5\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:21 crc 
kubenswrapper[4913]: I1001 13:06:21.003711 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:21 crc kubenswrapper[4913]: I1001 13:06:21.016202 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:21 crc kubenswrapper[4913]: I1001 13:06:21.028244 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzqz5\" (UniqueName: \"kubernetes.io/projected/bed26054-0c81-4aa3-93e9-8ad2f2200f86-kube-api-access-rzqz5\") pod \"ssh-known-hosts-edpm-deployment-56g4q\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:21 crc kubenswrapper[4913]: I1001 13:06:21.065555 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:21 crc kubenswrapper[4913]: I1001 13:06:21.590334 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-56g4q"] Oct 01 13:06:21 crc kubenswrapper[4913]: I1001 13:06:21.681999 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" event={"ID":"bed26054-0c81-4aa3-93e9-8ad2f2200f86","Type":"ContainerStarted","Data":"02227b8c0d4751bcb34feb0ced45f266bc8a358cda8c8f77db76edd99eb5c216"} Oct 01 13:06:22 crc kubenswrapper[4913]: I1001 13:06:22.691788 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" event={"ID":"bed26054-0c81-4aa3-93e9-8ad2f2200f86","Type":"ContainerStarted","Data":"211f75c1df34f329425c0137b956ac558ca4d1be71e75c8dd45e9e08679a880d"} Oct 01 13:06:22 crc kubenswrapper[4913]: I1001 13:06:22.710745 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" podStartSLOduration=2.024445906 podStartE2EDuration="2.710697802s" podCreationTimestamp="2025-10-01 13:06:20 +0000 UTC" firstStartedPulling="2025-10-01 13:06:21.601472726 +0000 UTC m=+1713.504948324" lastFinishedPulling="2025-10-01 13:06:22.287724642 +0000 UTC m=+1714.191200220" observedRunningTime="2025-10-01 13:06:22.7077443 +0000 UTC m=+1714.611219908" watchObservedRunningTime="2025-10-01 13:06:22.710697802 +0000 UTC m=+1714.614173390" Oct 01 13:06:28 crc kubenswrapper[4913]: I1001 13:06:28.814546 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:06:28 crc kubenswrapper[4913]: E1001 13:06:28.815724 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:06:29 crc 
kubenswrapper[4913]: I1001 13:06:29.757385 4913 generic.go:334] "Generic (PLEG): container finished" podID="bed26054-0c81-4aa3-93e9-8ad2f2200f86" containerID="211f75c1df34f329425c0137b956ac558ca4d1be71e75c8dd45e9e08679a880d" exitCode=0 Oct 01 13:06:29 crc kubenswrapper[4913]: I1001 13:06:29.757509 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" event={"ID":"bed26054-0c81-4aa3-93e9-8ad2f2200f86","Type":"ContainerDied","Data":"211f75c1df34f329425c0137b956ac558ca4d1be71e75c8dd45e9e08679a880d"} Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.202161 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.312041 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-ssh-key-openstack-edpm-ipam\") pod \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.312222 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzqz5\" (UniqueName: \"kubernetes.io/projected/bed26054-0c81-4aa3-93e9-8ad2f2200f86-kube-api-access-rzqz5\") pod \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.312320 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-inventory-0\") pod \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\" (UID: \"bed26054-0c81-4aa3-93e9-8ad2f2200f86\") " Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.321773 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed26054-0c81-4aa3-93e9-8ad2f2200f86-kube-api-access-rzqz5" (OuterVolumeSpecName: "kube-api-access-rzqz5") pod "bed26054-0c81-4aa3-93e9-8ad2f2200f86" (UID: "bed26054-0c81-4aa3-93e9-8ad2f2200f86"). InnerVolumeSpecName "kube-api-access-rzqz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.354513 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bed26054-0c81-4aa3-93e9-8ad2f2200f86" (UID: "bed26054-0c81-4aa3-93e9-8ad2f2200f86"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.354883 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bed26054-0c81-4aa3-93e9-8ad2f2200f86" (UID: "bed26054-0c81-4aa3-93e9-8ad2f2200f86"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.415637 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.415662 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzqz5\" (UniqueName: \"kubernetes.io/projected/bed26054-0c81-4aa3-93e9-8ad2f2200f86-kube-api-access-rzqz5\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.415673 4913 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bed26054-0c81-4aa3-93e9-8ad2f2200f86-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.776617 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" event={"ID":"bed26054-0c81-4aa3-93e9-8ad2f2200f86","Type":"ContainerDied","Data":"02227b8c0d4751bcb34feb0ced45f266bc8a358cda8c8f77db76edd99eb5c216"} Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.776658 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02227b8c0d4751bcb34feb0ced45f266bc8a358cda8c8f77db76edd99eb5c216" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.776707 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-56g4q" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.861702 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr"] Oct 01 13:06:31 crc kubenswrapper[4913]: E1001 13:06:31.862081 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed26054-0c81-4aa3-93e9-8ad2f2200f86" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.862102 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed26054-0c81-4aa3-93e9-8ad2f2200f86" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.862365 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed26054-0c81-4aa3-93e9-8ad2f2200f86" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.863023 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.866873 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.866883 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.867148 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.879557 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr"] Oct 01 13:06:31 crc kubenswrapper[4913]: I1001 13:06:31.879730 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.027251 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxs8q\" (UniqueName: \"kubernetes.io/projected/03d5a3d9-1379-4064-86b7-8f1dc11729a4-kube-api-access-qxs8q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.027501 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.027834 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.129440 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.129635 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.129721 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxs8q\" (UniqueName: \"kubernetes.io/projected/03d5a3d9-1379-4064-86b7-8f1dc11729a4-kube-api-access-qxs8q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.134782 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.135165 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.163133 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxs8q\" (UniqueName: \"kubernetes.io/projected/03d5a3d9-1379-4064-86b7-8f1dc11729a4-kube-api-access-qxs8q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pwrjr\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.180338 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.676032 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr"] Oct 01 13:06:32 crc kubenswrapper[4913]: W1001 13:06:32.686567 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03d5a3d9_1379_4064_86b7_8f1dc11729a4.slice/crio-bc6e9947100bc92fbf724f838dce267208edb062a332bc3bee2c8fa17a81ef33 WatchSource:0}: Error finding container bc6e9947100bc92fbf724f838dce267208edb062a332bc3bee2c8fa17a81ef33: Status 404 returned error can't find the container with id bc6e9947100bc92fbf724f838dce267208edb062a332bc3bee2c8fa17a81ef33 Oct 01 13:06:32 crc kubenswrapper[4913]: I1001 13:06:32.787877 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" event={"ID":"03d5a3d9-1379-4064-86b7-8f1dc11729a4","Type":"ContainerStarted","Data":"bc6e9947100bc92fbf724f838dce267208edb062a332bc3bee2c8fa17a81ef33"} Oct 01 13:06:33 crc kubenswrapper[4913]: I1001 13:06:33.797056 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" event={"ID":"03d5a3d9-1379-4064-86b7-8f1dc11729a4","Type":"ContainerStarted","Data":"aec210feb18a07e0b7f6a803740ba7a7b79e94feb989ce02534c98188cf4b9a4"} Oct 01 13:06:33 crc kubenswrapper[4913]: I1001 13:06:33.818655 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" podStartSLOduration=2.08700446 podStartE2EDuration="2.818635307s" podCreationTimestamp="2025-10-01 13:06:31 +0000 UTC" firstStartedPulling="2025-10-01 13:06:32.69012444 +0000 UTC m=+1724.593600018" lastFinishedPulling="2025-10-01 13:06:33.421755287 +0000 UTC m=+1725.325230865" observedRunningTime="2025-10-01 13:06:33.812106537 +0000 UTC m=+1725.715582125" watchObservedRunningTime="2025-10-01 13:06:33.818635307 +0000 UTC m=+1725.722110885" 
Oct 01 13:06:40 crc kubenswrapper[4913]: I1001 13:06:40.807643 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:06:40 crc kubenswrapper[4913]: E1001 13:06:40.809352 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:06:41 crc kubenswrapper[4913]: I1001 13:06:41.894145 4913 generic.go:334] "Generic (PLEG): container finished" podID="03d5a3d9-1379-4064-86b7-8f1dc11729a4" containerID="aec210feb18a07e0b7f6a803740ba7a7b79e94feb989ce02534c98188cf4b9a4" exitCode=0 Oct 01 13:06:41 crc kubenswrapper[4913]: I1001 13:06:41.894301 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" event={"ID":"03d5a3d9-1379-4064-86b7-8f1dc11729a4","Type":"ContainerDied","Data":"aec210feb18a07e0b7f6a803740ba7a7b79e94feb989ce02534c98188cf4b9a4"} Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.356001 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.356868 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-ssh-key\") pod \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.401033 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03d5a3d9-1379-4064-86b7-8f1dc11729a4" (UID: "03d5a3d9-1379-4064-86b7-8f1dc11729a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.457987 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxs8q\" (UniqueName: \"kubernetes.io/projected/03d5a3d9-1379-4064-86b7-8f1dc11729a4-kube-api-access-qxs8q\") pod \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.458299 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-inventory\") pod \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\" (UID: \"03d5a3d9-1379-4064-86b7-8f1dc11729a4\") " Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.458756 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.460757 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d5a3d9-1379-4064-86b7-8f1dc11729a4-kube-api-access-qxs8q" (OuterVolumeSpecName: "kube-api-access-qxs8q") pod "03d5a3d9-1379-4064-86b7-8f1dc11729a4" (UID: "03d5a3d9-1379-4064-86b7-8f1dc11729a4"). 
InnerVolumeSpecName "kube-api-access-qxs8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.478974 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-inventory" (OuterVolumeSpecName: "inventory") pod "03d5a3d9-1379-4064-86b7-8f1dc11729a4" (UID: "03d5a3d9-1379-4064-86b7-8f1dc11729a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.560816 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d5a3d9-1379-4064-86b7-8f1dc11729a4-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.560850 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxs8q\" (UniqueName: \"kubernetes.io/projected/03d5a3d9-1379-4064-86b7-8f1dc11729a4-kube-api-access-qxs8q\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.921464 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" event={"ID":"03d5a3d9-1379-4064-86b7-8f1dc11729a4","Type":"ContainerDied","Data":"bc6e9947100bc92fbf724f838dce267208edb062a332bc3bee2c8fa17a81ef33"} Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.922063 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc6e9947100bc92fbf724f838dce267208edb062a332bc3bee2c8fa17a81ef33" Oct 01 13:06:43 crc kubenswrapper[4913]: I1001 13:06:43.921586 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.007159 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr"] Oct 01 13:06:44 crc kubenswrapper[4913]: E1001 13:06:44.007530 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d5a3d9-1379-4064-86b7-8f1dc11729a4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.007549 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d5a3d9-1379-4064-86b7-8f1dc11729a4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.007756 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d5a3d9-1379-4064-86b7-8f1dc11729a4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.008382 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.011201 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.011436 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.012326 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.012600 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.034125 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr"] Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.072110 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.073507 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rw7g\" (UniqueName: \"kubernetes.io/projected/8287670d-b297-4845-9020-be42f33f82f2-kube-api-access-8rw7g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.073695 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.175485 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rw7g\" (UniqueName: \"kubernetes.io/projected/8287670d-b297-4845-9020-be42f33f82f2-kube-api-access-8rw7g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.175707 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.175794 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: 
\"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.180505 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.181040 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.204220 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rw7g\" (UniqueName: \"kubernetes.io/projected/8287670d-b297-4845-9020-be42f33f82f2-kube-api-access-8rw7g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.331374 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.899288 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr"] Oct 01 13:06:44 crc kubenswrapper[4913]: W1001 13:06:44.910383 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8287670d_b297_4845_9020_be42f33f82f2.slice/crio-364893be0c1e379a11a5d1d62e72c79e72c9dbe59a6dd4250a7dba100d30786d WatchSource:0}: Error finding container 364893be0c1e379a11a5d1d62e72c79e72c9dbe59a6dd4250a7dba100d30786d: Status 404 returned error can't find the container with id 364893be0c1e379a11a5d1d62e72c79e72c9dbe59a6dd4250a7dba100d30786d Oct 01 13:06:44 crc kubenswrapper[4913]: I1001 13:06:44.932417 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" event={"ID":"8287670d-b297-4845-9020-be42f33f82f2","Type":"ContainerStarted","Data":"364893be0c1e379a11a5d1d62e72c79e72c9dbe59a6dd4250a7dba100d30786d"} Oct 01 13:06:45 crc kubenswrapper[4913]: I1001 13:06:45.942826 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" event={"ID":"8287670d-b297-4845-9020-be42f33f82f2","Type":"ContainerStarted","Data":"1a423254bc50bb196b848503d26c963388ac847bbaa676561d3c490e5ddc5797"} Oct 01 13:06:53 crc kubenswrapper[4913]: I1001 13:06:53.806554 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:06:53 crc kubenswrapper[4913]: E1001 13:06:53.807530 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:06:54 crc kubenswrapper[4913]: I1001 13:06:54.041889 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" podStartSLOduration=10.349068308 podStartE2EDuration="11.041868505s" podCreationTimestamp="2025-10-01 13:06:43 +0000 UTC" firstStartedPulling="2025-10-01 13:06:44.913520431 +0000 UTC m=+1736.816996019" lastFinishedPulling="2025-10-01 13:06:45.606320648 +0000 UTC m=+1737.509796216" observedRunningTime="2025-10-01 13:06:45.959068431 +0000 UTC m=+1737.862544019" watchObservedRunningTime="2025-10-01 13:06:54.041868505 +0000 UTC m=+1745.945344083" Oct 01 13:06:54 crc kubenswrapper[4913]: I1001 13:06:54.045926 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4hjt"] Oct 01 13:06:54 crc kubenswrapper[4913]: I1001 13:06:54.054047 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4hjt"] Oct 01 13:06:54 crc kubenswrapper[4913]: I1001 13:06:54.821409 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56638b67-fd8f-40b1-85da-09edf48fbd46" path="/var/lib/kubelet/pods/56638b67-fd8f-40b1-85da-09edf48fbd46/volumes" Oct 01 13:06:56 crc kubenswrapper[4913]: I1001 13:06:56.042863 4913 generic.go:334] "Generic (PLEG): container finished" podID="8287670d-b297-4845-9020-be42f33f82f2" containerID="1a423254bc50bb196b848503d26c963388ac847bbaa676561d3c490e5ddc5797" exitCode=0 Oct 01 13:06:56 crc kubenswrapper[4913]: I1001 13:06:56.042964 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" event={"ID":"8287670d-b297-4845-9020-be42f33f82f2","Type":"ContainerDied","Data":"1a423254bc50bb196b848503d26c963388ac847bbaa676561d3c490e5ddc5797"} Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.552958 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.635713 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-inventory\") pod \"8287670d-b297-4845-9020-be42f33f82f2\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.635819 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rw7g\" (UniqueName: \"kubernetes.io/projected/8287670d-b297-4845-9020-be42f33f82f2-kube-api-access-8rw7g\") pod \"8287670d-b297-4845-9020-be42f33f82f2\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.635916 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-ssh-key\") pod \"8287670d-b297-4845-9020-be42f33f82f2\" (UID: \"8287670d-b297-4845-9020-be42f33f82f2\") " Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.653190 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8287670d-b297-4845-9020-be42f33f82f2-kube-api-access-8rw7g" (OuterVolumeSpecName: "kube-api-access-8rw7g") pod "8287670d-b297-4845-9020-be42f33f82f2" (UID: "8287670d-b297-4845-9020-be42f33f82f2"). InnerVolumeSpecName "kube-api-access-8rw7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.662095 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-inventory" (OuterVolumeSpecName: "inventory") pod "8287670d-b297-4845-9020-be42f33f82f2" (UID: "8287670d-b297-4845-9020-be42f33f82f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.662512 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8287670d-b297-4845-9020-be42f33f82f2" (UID: "8287670d-b297-4845-9020-be42f33f82f2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.737497 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.737522 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rw7g\" (UniqueName: \"kubernetes.io/projected/8287670d-b297-4845-9020-be42f33f82f2-kube-api-access-8rw7g\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:57 crc kubenswrapper[4913]: I1001 13:06:57.737533 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8287670d-b297-4845-9020-be42f33f82f2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:58 crc kubenswrapper[4913]: I1001 13:06:58.062709 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" event={"ID":"8287670d-b297-4845-9020-be42f33f82f2","Type":"ContainerDied","Data":"364893be0c1e379a11a5d1d62e72c79e72c9dbe59a6dd4250a7dba100d30786d"} Oct 01 13:06:58 crc kubenswrapper[4913]: I1001 13:06:58.062987 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="364893be0c1e379a11a5d1d62e72c79e72c9dbe59a6dd4250a7dba100d30786d" Oct 01 13:06:58 crc kubenswrapper[4913]: I1001 13:06:58.062804 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr" Oct 01 13:07:05 crc kubenswrapper[4913]: I1001 13:07:05.840526 4913 scope.go:117] "RemoveContainer" containerID="22b594d3a847571776800fd6b7741c6cebdb1088a933b58e14ad14dd505c11e3" Oct 01 13:07:05 crc kubenswrapper[4913]: I1001 13:07:05.878411 4913 scope.go:117] "RemoveContainer" containerID="af0568664959c58c8e024eaa00731438bdd8392c44c4691fcf4374d7f1a594cf" Oct 01 13:07:06 crc kubenswrapper[4913]: I1001 13:07:06.806897 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:07:06 crc kubenswrapper[4913]: E1001 13:07:06.807195 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:07:21 crc kubenswrapper[4913]: I1001 13:07:21.807081 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:07:21 crc kubenswrapper[4913]: E1001 13:07:21.808100 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:07:35 crc kubenswrapper[4913]: I1001 13:07:35.807212 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:07:35 crc kubenswrapper[4913]: E1001 13:07:35.807980 
4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:07:48 crc kubenswrapper[4913]: I1001 13:07:48.813248 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:07:49 crc kubenswrapper[4913]: I1001 13:07:49.562906 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"374f1e21af7aa5a14f018e46d1950c6956f6712b2205ecee89544e8f1a470d3e"} Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.077070 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7lvkg"] Oct 01 13:08:03 crc kubenswrapper[4913]: E1001 13:08:03.078291 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8287670d-b297-4845-9020-be42f33f82f2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.078308 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8287670d-b297-4845-9020-be42f33f82f2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.078524 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8287670d-b297-4845-9020-be42f33f82f2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.080221 4913 util.go:30] "No sandbox for pod can be found. 
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.077070 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7lvkg"]
Oct 01 13:08:03 crc kubenswrapper[4913]: E1001 13:08:03.078291 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8287670d-b297-4845-9020-be42f33f82f2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.078308 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8287670d-b297-4845-9020-be42f33f82f2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.078524 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8287670d-b297-4845-9020-be42f33f82f2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.080221 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.089490 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lvkg"]
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.192686 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmmvx\" (UniqueName: \"kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.192936 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-catalog-content\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.192956 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-utilities\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.294346 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmmvx\" (UniqueName: \"kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.294398 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-catalog-content\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.294417 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-utilities\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.294879 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-utilities\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.294946 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-catalog-content\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.314260 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmmvx\" (UniqueName: \"kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg"
"MountVolume.SetUp succeeded for volume \"kube-api-access-mmmvx\" (UniqueName: \"kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx\") pod \"community-operators-7lvkg\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.401325 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:03 crc kubenswrapper[4913]: I1001 13:08:03.969473 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lvkg"] Oct 01 13:08:04 crc kubenswrapper[4913]: I1001 13:08:04.698075 4913 generic.go:334] "Generic (PLEG): container finished" podID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerID="1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1" exitCode=0 Oct 01 13:08:04 crc kubenswrapper[4913]: I1001 13:08:04.698412 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerDied","Data":"1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1"} Oct 01 13:08:04 crc kubenswrapper[4913]: I1001 13:08:04.698444 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerStarted","Data":"73e3f0c24b8047be85cc3e8df80ddbf09d32dc203e21e7245d19a624cb76f7f5"} Oct 01 13:08:05 crc kubenswrapper[4913]: I1001 13:08:05.708609 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerStarted","Data":"fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c"} Oct 01 13:08:06 crc kubenswrapper[4913]: I1001 13:08:06.718382 4913 generic.go:334] "Generic (PLEG): container finished" podID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerID="fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c" exitCode=0 Oct 01 13:08:06 crc kubenswrapper[4913]: I1001 13:08:06.718415 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerDied","Data":"fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c"} Oct 01 13:08:07 crc kubenswrapper[4913]: I1001 13:08:07.729040 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerStarted","Data":"0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50"} Oct 01 13:08:07 crc kubenswrapper[4913]: I1001 13:08:07.750351 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7lvkg" podStartSLOduration=2.289362733 podStartE2EDuration="4.750335081s" podCreationTimestamp="2025-10-01 13:08:03 +0000 UTC" firstStartedPulling="2025-10-01 13:08:04.701246301 +0000 UTC m=+1816.604721899" lastFinishedPulling="2025-10-01 13:08:07.162218679 +0000 UTC m=+1819.065694247" observedRunningTime="2025-10-01 13:08:07.747879243 +0000 UTC m=+1819.651354861" watchObservedRunningTime="2025-10-01 13:08:07.750335081 +0000 UTC m=+1819.653810659" Oct 01 13:08:13 crc kubenswrapper[4913]: I1001 13:08:13.401730 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:13 crc kubenswrapper[4913]: I1001 13:08:13.402225 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:13 crc kubenswrapper[4913]: I1001 13:08:13.471518 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:13 crc kubenswrapper[4913]: I1001 13:08:13.886534 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:14 crc kubenswrapper[4913]: I1001 13:08:14.062640 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lvkg"] Oct 01 13:08:15 crc kubenswrapper[4913]: I1001 13:08:15.840623 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7lvkg" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="registry-server" containerID="cri-o://0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50" gracePeriod=2 Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.264649 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.434450 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-catalog-content\") pod \"837ff4a1-7522-4e94-8ea7-8326f82b256c\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.434598 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-utilities\") pod \"837ff4a1-7522-4e94-8ea7-8326f82b256c\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.435393 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmmvx\" (UniqueName: \"kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx\") pod \"837ff4a1-7522-4e94-8ea7-8326f82b256c\" (UID: \"837ff4a1-7522-4e94-8ea7-8326f82b256c\") " Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.443214 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-utilities" (OuterVolumeSpecName: "utilities") pod "837ff4a1-7522-4e94-8ea7-8326f82b256c" (UID: "837ff4a1-7522-4e94-8ea7-8326f82b256c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.447505 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx" (OuterVolumeSpecName: "kube-api-access-mmmvx") pod "837ff4a1-7522-4e94-8ea7-8326f82b256c" (UID: "837ff4a1-7522-4e94-8ea7-8326f82b256c"). InnerVolumeSpecName "kube-api-access-mmmvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.511693 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "837ff4a1-7522-4e94-8ea7-8326f82b256c" (UID: "837ff4a1-7522-4e94-8ea7-8326f82b256c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.538819 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmmvx\" (UniqueName: \"kubernetes.io/projected/837ff4a1-7522-4e94-8ea7-8326f82b256c-kube-api-access-mmmvx\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.538865 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.538884 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837ff4a1-7522-4e94-8ea7-8326f82b256c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.849996 4913 generic.go:334] "Generic (PLEG): container finished" podID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerID="0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50" exitCode=0 Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.850042 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerDied","Data":"0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50"} Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.850078 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lvkg" event={"ID":"837ff4a1-7522-4e94-8ea7-8326f82b256c","Type":"ContainerDied","Data":"73e3f0c24b8047be85cc3e8df80ddbf09d32dc203e21e7245d19a624cb76f7f5"} Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.850100 4913 scope.go:117] "RemoveContainer" containerID="0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.850870 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lvkg" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.873804 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lvkg"] Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.879440 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7lvkg"] Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.881417 4913 scope.go:117] "RemoveContainer" containerID="fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.908949 4913 scope.go:117] "RemoveContainer" containerID="1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.969629 4913 scope.go:117] "RemoveContainer" containerID="0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50" Oct 01 13:08:16 crc kubenswrapper[4913]: E1001 13:08:16.970358 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50\": container with ID starting with 0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50 not found: ID does not exist" containerID="0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.970420 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50"} err="failed to get container status \"0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50\": rpc error: code = NotFound desc = could not find container \"0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50\": container with ID starting with 0e54ab3602b93516da598928999af5e18e8088b19b53d0ae3baba8cc883f2c50 not found: ID does not exist" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.970448 4913 scope.go:117] "RemoveContainer" containerID="fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c" Oct 01 13:08:16 crc kubenswrapper[4913]: E1001 13:08:16.971006 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c\": container with ID starting with fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c not found: ID does not exist" containerID="fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.971151 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c"} err="failed to get container status \"fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c\": rpc error: code = NotFound desc = could not find container \"fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c\": container with ID starting with fb2a38bb121509aae022d4516e61e74457b9153a04c7870b0a125d84ebac556c not found: ID does not exist" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.971327 4913 scope.go:117] "RemoveContainer" containerID="1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1" Oct 01 13:08:16 crc kubenswrapper[4913]: E1001 13:08:16.971814 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1\": container with ID starting with 1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1 not found: ID does not exist" containerID="1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1" Oct 01 13:08:16 crc kubenswrapper[4913]: I1001 13:08:16.971848 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1"} err="failed to get container status \"1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1\": rpc error: code = NotFound desc = could not find container \"1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1\": container with ID starting with 1e6b93ed3f9fcf62f0999359c24444a4c9801e66a49eeaf3076fe06667e262e1 not found: ID does not exist" Oct 01 13:08:18 crc kubenswrapper[4913]: I1001 13:08:18.831620 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" path="/var/lib/kubelet/pods/837ff4a1-7522-4e94-8ea7-8326f82b256c/volumes" Oct 01 13:10:10 crc kubenswrapper[4913]: I1001 13:10:10.083723 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:10:10 crc kubenswrapper[4913]: I1001 13:10:10.084433 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:10:40 crc kubenswrapper[4913]: I1001 13:10:40.083965 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:10:40 crc kubenswrapper[4913]: I1001 13:10:40.084984 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.083351 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.084006 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.084052 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.084851 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"374f1e21af7aa5a14f018e46d1950c6956f6712b2205ecee89544e8f1a470d3e"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.084907 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://374f1e21af7aa5a14f018e46d1950c6956f6712b2205ecee89544e8f1a470d3e" gracePeriod=600 Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.470678 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="374f1e21af7aa5a14f018e46d1950c6956f6712b2205ecee89544e8f1a470d3e" exitCode=0 Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.470756 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"374f1e21af7aa5a14f018e46d1950c6956f6712b2205ecee89544e8f1a470d3e"} Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.471082 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5"} Oct 01 13:11:10 crc kubenswrapper[4913]: I1001 13:11:10.471103 4913 scope.go:117] "RemoveContainer" containerID="f0db22f9095c97c462e86d824512a9593b31665e4ac2ce14852d2cf4b7fe0ca7" Oct 01 13:11:57 crc kubenswrapper[4913]: I1001 13:11:57.998600 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.009089 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.018585 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v6kx7"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.024732 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.037607 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.045849 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.053105 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.059417 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx"] Oct 01 
13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.065467 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.072383 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pwrjr"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.080303 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6cdwb"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.086771 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rfrj7"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.093615 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-56g4q"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.100420 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tn4p4"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.106347 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.113214 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.120788 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2hdgx"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.126902 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m9swr"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.132355 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-56g4q"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.154438 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hq6xk"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.163101 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4m44z"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.171912 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-d2n26"] Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.826697 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d5a3d9-1379-4064-86b7-8f1dc11729a4" path="/var/lib/kubelet/pods/03d5a3d9-1379-4064-86b7-8f1dc11729a4/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.828448 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ea86ed-6acc-47b0-b8cf-92a7f2c44428" path="/var/lib/kubelet/pods/38ea86ed-6acc-47b0-b8cf-92a7f2c44428/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.829643 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd227f3-5b5c-4154-9b86-e0a2c2bfe591" path="/var/lib/kubelet/pods/4dd227f3-5b5c-4154-9b86-e0a2c2bfe591/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.831006 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1c1c18-0832-49de-88ae-40f2bc5be31f" 
path="/var/lib/kubelet/pods/5f1c1c18-0832-49de-88ae-40f2bc5be31f/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.833364 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8287670d-b297-4845-9020-be42f33f82f2" path="/var/lib/kubelet/pods/8287670d-b297-4845-9020-be42f33f82f2/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.835164 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a50a246d-1d3c-4764-aab3-3615be090051" path="/var/lib/kubelet/pods/a50a246d-1d3c-4764-aab3-3615be090051/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.836525 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed26054-0c81-4aa3-93e9-8ad2f2200f86" path="/var/lib/kubelet/pods/bed26054-0c81-4aa3-93e9-8ad2f2200f86/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.839315 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c" path="/var/lib/kubelet/pods/cdfbdd3e-25dc-4e03-b5b1-c7cdfe9e1f9c/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.841263 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e576334d-7925-4127-b1b9-2d613614437f" path="/var/lib/kubelet/pods/e576334d-7925-4127-b1b9-2d613614437f/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.844634 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb05f7e6-1dd6-4893-8394-8e88e69f141f" path="/var/lib/kubelet/pods/fb05f7e6-1dd6-4893-8394-8e88e69f141f/volumes" Oct 01 13:11:58 crc kubenswrapper[4913]: I1001 13:11:58.845981 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0f29f2-344b-41d8-aea2-7d29e013aeec" path="/var/lib/kubelet/pods/ff0f29f2-344b-41d8-aea2-7d29e013aeec/volumes" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.474956 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l"] Oct 01 13:12:04 crc kubenswrapper[4913]: E1001 13:12:04.476156 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="extract-content" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.476192 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="extract-content" Oct 01 13:12:04 crc kubenswrapper[4913]: E1001 13:12:04.476245 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="extract-utilities" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.476299 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="extract-utilities" Oct 01 13:12:04 crc kubenswrapper[4913]: E1001 13:12:04.476335 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="registry-server" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.476355 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="registry-server" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.476851 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="837ff4a1-7522-4e94-8ea7-8326f82b256c" containerName="registry-server" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.479654 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.481900 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.482599 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.482954 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.483498 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.484325 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.495984 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l"] Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.647909 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.648054 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sng66\" (UniqueName: \"kubernetes.io/projected/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-kube-api-access-sng66\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.648125 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.648398 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.648472 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.749929 4913 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.750038 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sng66\" (UniqueName: \"kubernetes.io/projected/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-kube-api-access-sng66\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.750107 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.750263 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.750331 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.762150 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.762320 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.762898 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.764127 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.787484 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sng66\" (UniqueName: \"kubernetes.io/projected/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-kube-api-access-sng66\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:04 crc kubenswrapper[4913]: I1001 13:12:04.813808 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:05 crc kubenswrapper[4913]: I1001 13:12:05.449519 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l"] Oct 01 13:12:05 crc kubenswrapper[4913]: I1001 13:12:05.457498 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:12:05 crc kubenswrapper[4913]: I1001 13:12:05.952882 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" event={"ID":"5a28de19-ceb2-4b36-ae51-1b69d134b6fd","Type":"ContainerStarted","Data":"3ca443462d304846d5c0fdf5d18d2b48c8f5c3bb2f31161fcfac94d8b0993248"} Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.064144 4913 scope.go:117] "RemoveContainer" containerID="378edc9f10c714ad8b9ae64357adec5085cee223828e2ab2ecf3ef2a1e764ffc" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.123826 4913 scope.go:117] "RemoveContainer" containerID="5c32704683e7da980f7b5fbab4a2da43715e012432346c1b88443a5f1bce7c2f" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.196140 4913 scope.go:117] "RemoveContainer" containerID="2356606cf64cc9eeabdb4c4d2edd3914e766727182e8deafc0bf581c5f9cc03c" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.219996 4913 scope.go:117] "RemoveContainer" containerID="b24d119dd29c4c04a767344d0a98df004efe8b045040a98d6bc94f69f5ca4f44" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.280368 4913 scope.go:117] "RemoveContainer" containerID="21710f16105cc155ca639d83af3cb105eeaf4904cbe8499254132c8af953e217" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.311577 4913 scope.go:117] "RemoveContainer" containerID="ddca2dee667cc931bba0e873e748681b1ad4ff16f4afe97d82d6d2ba2d6afab1" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.354375 4913 scope.go:117] "RemoveContainer" containerID="dc058e8d0782c134e52ac03f6d2e26fe129be7c52950ba4a1bd8d5f0bdba7661" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.391896 4913 scope.go:117] "RemoveContainer" containerID="066f85645a28e552a720991dd457c68fdaebe90e2675cdbfeeed9f2ac7145ed6" Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.962785 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" event={"ID":"5a28de19-ceb2-4b36-ae51-1b69d134b6fd","Type":"ContainerStarted","Data":"db66144426ff8460131b5e5da3d62e4379d094d17fb6f9a486ebc3be083a4039"} Oct 01 13:12:06 crc kubenswrapper[4913]: I1001 13:12:06.985805 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" podStartSLOduration=2.1191149 podStartE2EDuration="2.985785419s" podCreationTimestamp="2025-10-01 13:12:04 +0000 UTC" firstStartedPulling="2025-10-01 13:12:05.457237784 +0000 UTC m=+2057.360713362" lastFinishedPulling="2025-10-01 13:12:06.323908303 +0000 UTC m=+2058.227383881" observedRunningTime="2025-10-01 13:12:06.979006821 +0000 UTC m=+2058.882482419" watchObservedRunningTime="2025-10-01 13:12:06.985785419 +0000 UTC m=+2058.889260997" Oct 01 13:12:13 crc kubenswrapper[4913]: I1001 13:12:13.941084 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h2m5l"] Oct 01 13:12:13 crc kubenswrapper[4913]: I1001 13:12:13.943902 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:13 crc kubenswrapper[4913]: I1001 13:12:13.952002 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2m5l"] Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.023639 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnt42\" (UniqueName: \"kubernetes.io/projected/30cf706f-ad3d-4ac4-b298-9239d73aedc1-kube-api-access-gnt42\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.023712 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-utilities\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.023731 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-catalog-content\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.125617 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnt42\" (UniqueName: \"kubernetes.io/projected/30cf706f-ad3d-4ac4-b298-9239d73aedc1-kube-api-access-gnt42\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.125685 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-utilities\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.125705 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-catalog-content\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.126289 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-catalog-content\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.126260 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-utilities\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.147082 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnt42\" (UniqueName: \"kubernetes.io/projected/30cf706f-ad3d-4ac4-b298-9239d73aedc1-kube-api-access-gnt42\") pod \"redhat-marketplace-h2m5l\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.266191 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:14 crc kubenswrapper[4913]: I1001 13:12:14.711964 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2m5l"] Oct 01 13:12:15 crc kubenswrapper[4913]: I1001 13:12:15.022653 4913 generic.go:334] "Generic (PLEG): container finished" podID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerID="04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2" exitCode=0 Oct 01 13:12:15 crc kubenswrapper[4913]: I1001 13:12:15.022760 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerDied","Data":"04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2"} Oct 01 13:12:15 crc kubenswrapper[4913]: I1001 13:12:15.022991 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerStarted","Data":"4d4c056343fefa57fe724cd19fe1e4e4ae38368237aeec44008f5a6bb097ac80"} Oct 01 13:12:16 crc kubenswrapper[4913]: I1001 13:12:16.033166 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerStarted","Data":"9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4"} Oct 01 13:12:17 crc kubenswrapper[4913]: I1001 13:12:17.042622 4913 generic.go:334] "Generic (PLEG): container finished" podID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerID="9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4" exitCode=0 Oct 01 13:12:17 crc kubenswrapper[4913]: I1001 13:12:17.042679 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerDied","Data":"9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4"} Oct 01 13:12:18 crc kubenswrapper[4913]: I1001 13:12:18.053392 4913 generic.go:334] "Generic (PLEG): container finished" podID="5a28de19-ceb2-4b36-ae51-1b69d134b6fd" containerID="db66144426ff8460131b5e5da3d62e4379d094d17fb6f9a486ebc3be083a4039" exitCode=0 Oct 01 13:12:18 crc kubenswrapper[4913]: 
I1001 13:12:18.053642 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" event={"ID":"5a28de19-ceb2-4b36-ae51-1b69d134b6fd","Type":"ContainerDied","Data":"db66144426ff8460131b5e5da3d62e4379d094d17fb6f9a486ebc3be083a4039"} Oct 01 13:12:18 crc kubenswrapper[4913]: I1001 13:12:18.056494 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerStarted","Data":"ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27"} Oct 01 13:12:18 crc kubenswrapper[4913]: I1001 13:12:18.096927 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h2m5l" podStartSLOduration=2.361556745 podStartE2EDuration="5.096902534s" podCreationTimestamp="2025-10-01 13:12:13 +0000 UTC" firstStartedPulling="2025-10-01 13:12:15.024964 +0000 UTC m=+2066.928439578" lastFinishedPulling="2025-10-01 13:12:17.760309779 +0000 UTC m=+2069.663785367" observedRunningTime="2025-10-01 13:12:18.092205771 +0000 UTC m=+2069.995681369" watchObservedRunningTime="2025-10-01 13:12:18.096902534 +0000 UTC m=+2070.000378142" Oct 01 13:12:18 crc kubenswrapper[4913]: E1001 13:12:18.106857 4913 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a28de19_ceb2_4b36_ae51_1b69d134b6fd.slice/crio-conmon-db66144426ff8460131b5e5da3d62e4379d094d17fb6f9a486ebc3be083a4039.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.552534 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.628873 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-inventory\") pod \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.628918 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ceph\") pod \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.628955 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ssh-key\") pod \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.629011 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sng66\" (UniqueName: \"kubernetes.io/projected/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-kube-api-access-sng66\") pod \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.629156 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-repo-setup-combined-ca-bundle\") pod \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\" (UID: \"5a28de19-ceb2-4b36-ae51-1b69d134b6fd\") " Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.634453 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5a28de19-ceb2-4b36-ae51-1b69d134b6fd" (UID: "5a28de19-ceb2-4b36-ae51-1b69d134b6fd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.634531 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-kube-api-access-sng66" (OuterVolumeSpecName: "kube-api-access-sng66") pod "5a28de19-ceb2-4b36-ae51-1b69d134b6fd" (UID: "5a28de19-ceb2-4b36-ae51-1b69d134b6fd"). InnerVolumeSpecName "kube-api-access-sng66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.634628 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ceph" (OuterVolumeSpecName: "ceph") pod "5a28de19-ceb2-4b36-ae51-1b69d134b6fd" (UID: "5a28de19-ceb2-4b36-ae51-1b69d134b6fd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.659429 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-inventory" (OuterVolumeSpecName: "inventory") pod "5a28de19-ceb2-4b36-ae51-1b69d134b6fd" (UID: "5a28de19-ceb2-4b36-ae51-1b69d134b6fd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.671207 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a28de19-ceb2-4b36-ae51-1b69d134b6fd" (UID: "5a28de19-ceb2-4b36-ae51-1b69d134b6fd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.730734 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sng66\" (UniqueName: \"kubernetes.io/projected/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-kube-api-access-sng66\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.730757 4913 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.730769 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.730778 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:19 crc kubenswrapper[4913]: I1001 13:12:19.730786 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a28de19-ceb2-4b36-ae51-1b69d134b6fd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.073522 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" event={"ID":"5a28de19-ceb2-4b36-ae51-1b69d134b6fd","Type":"ContainerDied","Data":"3ca443462d304846d5c0fdf5d18d2b48c8f5c3bb2f31161fcfac94d8b0993248"} Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.073561 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca443462d304846d5c0fdf5d18d2b48c8f5c3bb2f31161fcfac94d8b0993248" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.073584 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.227648 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg"] Oct 01 13:12:20 crc kubenswrapper[4913]: E1001 13:12:20.228174 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a28de19-ceb2-4b36-ae51-1b69d134b6fd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.228204 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a28de19-ceb2-4b36-ae51-1b69d134b6fd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.228665 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a28de19-ceb2-4b36-ae51-1b69d134b6fd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.229683 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.240034 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg"] Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.240615 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.240853 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.241002 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.241754 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.242104 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.340022 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.340064 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.340096 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.340121 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8b8n\" (UniqueName: \"kubernetes.io/projected/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-kube-api-access-z8b8n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.340202 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.442073 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.442560 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.442698 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.442819 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.442938 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8b8n\" (UniqueName: \"kubernetes.io/projected/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-kube-api-access-z8b8n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.447829 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 
13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.448223 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.448549 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.449522 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.462165 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8b8n\" (UniqueName: \"kubernetes.io/projected/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-kube-api-access-z8b8n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:20 crc kubenswrapper[4913]: I1001 13:12:20.561994 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:12:21 crc kubenswrapper[4913]: I1001 13:12:21.160362 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg"] Oct 01 13:12:22 crc kubenswrapper[4913]: I1001 13:12:22.093545 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" event={"ID":"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02","Type":"ContainerStarted","Data":"2c1e5f8a8419b505d557b558a0706bbcc7c29f7da6bbb51204c169dace1f180d"} Oct 01 13:12:22 crc kubenswrapper[4913]: I1001 13:12:22.093791 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" event={"ID":"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02","Type":"ContainerStarted","Data":"4ffcfda2ddd32f22424dad6c1a0e2e6a687560b4fcb32682a603094b1fa3c568"} Oct 01 13:12:22 crc kubenswrapper[4913]: I1001 13:12:22.115942 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" podStartSLOduration=1.678115063 podStartE2EDuration="2.115918223s" podCreationTimestamp="2025-10-01 13:12:20 +0000 UTC" firstStartedPulling="2025-10-01 13:12:21.16958509 +0000 UTC m=+2073.073060668" lastFinishedPulling="2025-10-01 13:12:21.60738825 +0000 UTC m=+2073.510863828" observedRunningTime="2025-10-01 13:12:22.113394598 +0000 UTC m=+2074.016870206" watchObservedRunningTime="2025-10-01 13:12:22.115918223 +0000 UTC m=+2074.019393801" Oct 01 13:12:24 crc kubenswrapper[4913]: I1001 13:12:24.266654 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:24 crc kubenswrapper[4913]: I1001 13:12:24.266881 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:24 crc kubenswrapper[4913]: I1001 13:12:24.313340 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:25 crc kubenswrapper[4913]: I1001 13:12:25.164970 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:25 crc kubenswrapper[4913]: I1001 13:12:25.230794 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2m5l"] Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.137332 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h2m5l" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="registry-server" containerID="cri-o://ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27" gracePeriod=2 Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.573972 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.674666 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnt42\" (UniqueName: \"kubernetes.io/projected/30cf706f-ad3d-4ac4-b298-9239d73aedc1-kube-api-access-gnt42\") pod \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.674764 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-utilities\") pod \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.674967 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-catalog-content\") pod \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\" (UID: \"30cf706f-ad3d-4ac4-b298-9239d73aedc1\") " Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.676662 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-utilities" (OuterVolumeSpecName: "utilities") pod "30cf706f-ad3d-4ac4-b298-9239d73aedc1" (UID: "30cf706f-ad3d-4ac4-b298-9239d73aedc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.683522 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30cf706f-ad3d-4ac4-b298-9239d73aedc1-kube-api-access-gnt42" (OuterVolumeSpecName: "kube-api-access-gnt42") pod "30cf706f-ad3d-4ac4-b298-9239d73aedc1" (UID: "30cf706f-ad3d-4ac4-b298-9239d73aedc1"). InnerVolumeSpecName "kube-api-access-gnt42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.688399 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30cf706f-ad3d-4ac4-b298-9239d73aedc1" (UID: "30cf706f-ad3d-4ac4-b298-9239d73aedc1"). InnerVolumeSpecName "catalog-content". 
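The UnmountVolume / TearDown sequence above reflects the kubelet volume manager's reconciler loop: once the pod is deleted, its volumes drop out of the desired state, and the reconciler unmounts whatever remains in the actual state. A minimal sketch of that desired-vs-actual pattern, with hypothetical types rather than kubelet's real internals:

package main

import "fmt"

// volumeKey identifies a mounted volume the way these log lines do:
// plugin name plus the pod-qualified volume name.
type volumeKey struct {
	plugin string // e.g. "kubernetes.io/empty-dir"
	name   string // e.g. "catalog-content"
}

// reconcile unmounts every volume that is actually mounted but no
// longer desired -- the situation right after a pod is deleted.
func reconcile(desired, actual map[volumeKey]bool) {
	for vol := range actual {
		if !desired[vol] {
			fmt.Printf("UnmountVolume started for volume %q (plugin %q)\n", vol.name, vol.plugin)
			// TearDown would remove the mount here; for an empty-dir
			// volume that is essentially deleting the backing directory.
			delete(actual, vol)
			fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", vol.name)
		}
	}
}

func main() {
	actual := map[volumeKey]bool{
		{"kubernetes.io/empty-dir", "utilities"}:             true,
		{"kubernetes.io/empty-dir", "catalog-content"}:       true,
		{"kubernetes.io/projected", "kube-api-access-gnt42"}: true,
	}
	// Pod deleted: the desired set is empty, so everything unmounts,
	// after which the "Volume detached" entries below can be logged.
	reconcile(map[volumeKey]bool{}, actual)
}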
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.777465 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.777521 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnt42\" (UniqueName: \"kubernetes.io/projected/30cf706f-ad3d-4ac4-b298-9239d73aedc1-kube-api-access-gnt42\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:27 crc kubenswrapper[4913]: I1001 13:12:27.777539 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cf706f-ad3d-4ac4-b298-9239d73aedc1-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.147211 4913 generic.go:334] "Generic (PLEG): container finished" podID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerID="ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27" exitCode=0 Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.147252 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerDied","Data":"ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27"} Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.147289 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2m5l" event={"ID":"30cf706f-ad3d-4ac4-b298-9239d73aedc1","Type":"ContainerDied","Data":"4d4c056343fefa57fe724cd19fe1e4e4ae38368237aeec44008f5a6bb097ac80"} Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.147305 4913 scope.go:117] "RemoveContainer" containerID="ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.147438 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2m5l" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.187534 4913 scope.go:117] "RemoveContainer" containerID="9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.187767 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2m5l"] Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.196298 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2m5l"] Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.209637 4913 scope.go:117] "RemoveContainer" containerID="04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.296224 4913 scope.go:117] "RemoveContainer" containerID="ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27" Oct 01 13:12:28 crc kubenswrapper[4913]: E1001 13:12:28.297913 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27\": container with ID starting with ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27 not found: ID does not exist" containerID="ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.297953 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27"} err="failed to get container status \"ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27\": rpc error: code = NotFound desc = could not find container \"ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27\": container with ID starting with ac81bc25ffa57e4b0d7013917e2b0465bf98c8878bc6057a55d526429991ae27 not found: ID does not exist" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.297976 4913 scope.go:117] "RemoveContainer" containerID="9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4" Oct 01 13:12:28 crc kubenswrapper[4913]: E1001 13:12:28.302216 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4\": container with ID starting with 9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4 not found: ID does not exist" containerID="9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.302252 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4"} err="failed to get container status \"9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4\": rpc error: code = NotFound desc = could not find container \"9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4\": container with ID starting with 9289e668ea24337ffa01b44e2d3c2df383f06e6fae5919ad40a854fcb1de7dd4 not found: ID does not exist" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.302286 4913 scope.go:117] "RemoveContainer" containerID="04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2" Oct 01 13:12:28 crc kubenswrapper[4913]: E1001 13:12:28.303245 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2\": container with ID starting with 04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2 not found: ID does not exist" containerID="04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.303292 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2"} err="failed to get container status \"04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2\": rpc error: code = NotFound desc = could not find container \"04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2\": container with ID starting with 04fb08f1bc9de4f26704cf44fea99fcac64139b969deefbcd3f3cf2bb11fe4c2 not found: ID does not exist" Oct 01 13:12:28 crc kubenswrapper[4913]: E1001 13:12:28.390588 4913 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30cf706f_ad3d_4ac4_b298_9239d73aedc1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30cf706f_ad3d_4ac4_b298_9239d73aedc1.slice/crio-4d4c056343fefa57fe724cd19fe1e4e4ae38368237aeec44008f5a6bb097ac80\": RecentStats: unable to find data in memory cache]" Oct 01 13:12:28 crc kubenswrapper[4913]: I1001 13:12:28.817880 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" path="/var/lib/kubelet/pods/30cf706f-ad3d-4ac4-b298-9239d73aedc1/volumes" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.352136 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bd6h2"] Oct 01 13:12:38 crc kubenswrapper[4913]: E1001 13:12:38.359607 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="extract-content" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.359648 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="extract-content" Oct 01 13:12:38 crc kubenswrapper[4913]: E1001 13:12:38.359689 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="registry-server" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.359699 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="registry-server" Oct 01 13:12:38 crc kubenswrapper[4913]: E1001 13:12:38.359917 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="extract-utilities" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.359927 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="extract-utilities" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.374912 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="30cf706f-ad3d-4ac4-b298-9239d73aedc1" containerName="registry-server" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.377808 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.392482 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd6h2"] Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.483625 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-utilities\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.483790 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtp7\" (UniqueName: \"kubernetes.io/projected/08f26e04-2409-4607-ab98-ab311ea3cc50-kube-api-access-tqtp7\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.483819 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-catalog-content\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.585183 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqtp7\" (UniqueName: \"kubernetes.io/projected/08f26e04-2409-4607-ab98-ab311ea3cc50-kube-api-access-tqtp7\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.585234 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-catalog-content\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.585321 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-utilities\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.585863 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-utilities\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.586468 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-catalog-content\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.606402 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tqtp7\" (UniqueName: \"kubernetes.io/projected/08f26e04-2409-4607-ab98-ab311ea3cc50-kube-api-access-tqtp7\") pod \"certified-operators-bd6h2\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:38 crc kubenswrapper[4913]: I1001 13:12:38.704485 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:39 crc kubenswrapper[4913]: I1001 13:12:39.252674 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd6h2"] Oct 01 13:12:39 crc kubenswrapper[4913]: W1001 13:12:39.260507 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08f26e04_2409_4607_ab98_ab311ea3cc50.slice/crio-abe314ddd91b0ad8a005d91abf3cf1f269e33abc9d987636a291e19f2b6640dd WatchSource:0}: Error finding container abe314ddd91b0ad8a005d91abf3cf1f269e33abc9d987636a291e19f2b6640dd: Status 404 returned error can't find the container with id abe314ddd91b0ad8a005d91abf3cf1f269e33abc9d987636a291e19f2b6640dd Oct 01 13:12:40 crc kubenswrapper[4913]: I1001 13:12:40.249915 4913 generic.go:334] "Generic (PLEG): container finished" podID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerID="6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec" exitCode=0 Oct 01 13:12:40 crc kubenswrapper[4913]: I1001 13:12:40.249970 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerDied","Data":"6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec"} Oct 01 13:12:40 crc kubenswrapper[4913]: I1001 13:12:40.250354 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerStarted","Data":"abe314ddd91b0ad8a005d91abf3cf1f269e33abc9d987636a291e19f2b6640dd"} Oct 01 13:12:41 crc kubenswrapper[4913]: I1001 13:12:41.260478 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerStarted","Data":"dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414"} Oct 01 13:12:42 crc kubenswrapper[4913]: I1001 13:12:42.269579 4913 generic.go:334] "Generic (PLEG): container finished" podID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerID="dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414" exitCode=0 Oct 01 13:12:42 crc kubenswrapper[4913]: I1001 13:12:42.269636 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerDied","Data":"dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414"} Oct 01 13:12:43 crc kubenswrapper[4913]: I1001 13:12:43.282393 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerStarted","Data":"5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9"} Oct 01 13:12:43 crc kubenswrapper[4913]: I1001 13:12:43.322046 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bd6h2" 
podStartSLOduration=2.734140494 podStartE2EDuration="5.322020134s" podCreationTimestamp="2025-10-01 13:12:38 +0000 UTC" firstStartedPulling="2025-10-01 13:12:40.254136313 +0000 UTC m=+2092.157611931" lastFinishedPulling="2025-10-01 13:12:42.842015993 +0000 UTC m=+2094.745491571" observedRunningTime="2025-10-01 13:12:43.314797065 +0000 UTC m=+2095.218272693" watchObservedRunningTime="2025-10-01 13:12:43.322020134 +0000 UTC m=+2095.225495722" Oct 01 13:12:48 crc kubenswrapper[4913]: I1001 13:12:48.713552 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:48 crc kubenswrapper[4913]: I1001 13:12:48.714342 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:48 crc kubenswrapper[4913]: I1001 13:12:48.787122 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:49 crc kubenswrapper[4913]: I1001 13:12:49.383199 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:49 crc kubenswrapper[4913]: I1001 13:12:49.427232 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd6h2"] Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.355876 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bd6h2" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="registry-server" containerID="cri-o://5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9" gracePeriod=2 Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.799147 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.957558 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-catalog-content\") pod \"08f26e04-2409-4607-ab98-ab311ea3cc50\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.957608 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-utilities\") pod \"08f26e04-2409-4607-ab98-ab311ea3cc50\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.957718 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqtp7\" (UniqueName: \"kubernetes.io/projected/08f26e04-2409-4607-ab98-ab311ea3cc50-kube-api-access-tqtp7\") pod \"08f26e04-2409-4607-ab98-ab311ea3cc50\" (UID: \"08f26e04-2409-4607-ab98-ab311ea3cc50\") " Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.958923 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-utilities" (OuterVolumeSpecName: "utilities") pod "08f26e04-2409-4607-ab98-ab311ea3cc50" (UID: "08f26e04-2409-4607-ab98-ab311ea3cc50"). InnerVolumeSpecName "utilities". 
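The pod_startup_latency_tracker entry above reports two durations for certified-operators-bd6h2: podStartE2EDuration (pod creation to observed running, about 5.322s) and the smaller podStartSLOduration (about 2.734s), which is consistent with the end-to-end time minus the image-pull window between firstStartedPulling and lastFinishedPulling. A short Go check of that relationship using the timestamps from the entry (the reported figures come from monotonic clock readings, so the last digits differ slightly):

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse(time.RFC3339Nano, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-01T13:12:38Z")
	firstPull := mustParse("2025-10-01T13:12:40.254136313Z")
	lastPull := mustParse("2025-10-01T13:12:42.842015993Z")
	running := mustParse("2025-10-01T13:12:43.322020134Z")

	e2e := running.Sub(created)     // ~5.322s, matches podStartE2EDuration
	pull := lastPull.Sub(firstPull) // ~2.588s spent pulling images
	slo := e2e - pull               // ~2.734s, matches podStartSLOduration

	fmt.Println(e2e, pull, slo)
}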
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.965572 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f26e04-2409-4607-ab98-ab311ea3cc50-kube-api-access-tqtp7" (OuterVolumeSpecName: "kube-api-access-tqtp7") pod "08f26e04-2409-4607-ab98-ab311ea3cc50" (UID: "08f26e04-2409-4607-ab98-ab311ea3cc50"). InnerVolumeSpecName "kube-api-access-tqtp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:12:51 crc kubenswrapper[4913]: I1001 13:12:51.999224 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08f26e04-2409-4607-ab98-ab311ea3cc50" (UID: "08f26e04-2409-4607-ab98-ab311ea3cc50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.059420 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.059450 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f26e04-2409-4607-ab98-ab311ea3cc50-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.059464 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqtp7\" (UniqueName: \"kubernetes.io/projected/08f26e04-2409-4607-ab98-ab311ea3cc50-kube-api-access-tqtp7\") on node \"crc\" DevicePath \"\"" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.373563 4913 generic.go:334] "Generic (PLEG): container finished" podID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerID="5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9" exitCode=0 Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.373613 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerDied","Data":"5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9"} Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.373644 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd6h2" event={"ID":"08f26e04-2409-4607-ab98-ab311ea3cc50","Type":"ContainerDied","Data":"abe314ddd91b0ad8a005d91abf3cf1f269e33abc9d987636a291e19f2b6640dd"} Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.373641 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bd6h2" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.373696 4913 scope.go:117] "RemoveContainer" containerID="5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.404844 4913 scope.go:117] "RemoveContainer" containerID="dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.413947 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd6h2"] Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.425697 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bd6h2"] Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.433198 4913 scope.go:117] "RemoveContainer" containerID="6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.460597 4913 scope.go:117] "RemoveContainer" containerID="5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9" Oct 01 13:12:52 crc kubenswrapper[4913]: E1001 13:12:52.461085 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9\": container with ID starting with 5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9 not found: ID does not exist" containerID="5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.461129 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9"} err="failed to get container status \"5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9\": rpc error: code = NotFound desc = could not find container \"5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9\": container with ID starting with 5fd264ee30a4ff27338b5e0a18194451cc3fbcf92abe5f41974f48d9b6c971d9 not found: ID does not exist" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.461156 4913 scope.go:117] "RemoveContainer" containerID="dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414" Oct 01 13:12:52 crc kubenswrapper[4913]: E1001 13:12:52.461600 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414\": container with ID starting with dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414 not found: ID does not exist" containerID="dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.461628 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414"} err="failed to get container status \"dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414\": rpc error: code = NotFound desc = could not find container \"dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414\": container with ID starting with dcd46fe622442c742865d18efda9d26ead94588f8fad45f3ee1bd27bf2bce414 not found: ID does not exist" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.461646 4913 scope.go:117] "RemoveContainer" 
containerID="6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec" Oct 01 13:12:52 crc kubenswrapper[4913]: E1001 13:12:52.461952 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec\": container with ID starting with 6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec not found: ID does not exist" containerID="6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.461981 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec"} err="failed to get container status \"6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec\": rpc error: code = NotFound desc = could not find container \"6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec\": container with ID starting with 6d2c2a05561381c0d03ee421a251d486102c59d857717c66aca0175fa0cff3ec not found: ID does not exist" Oct 01 13:12:52 crc kubenswrapper[4913]: I1001 13:12:52.834768 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" path="/var/lib/kubelet/pods/08f26e04-2409-4607-ab98-ab311ea3cc50/volumes" Oct 01 13:13:06 crc kubenswrapper[4913]: I1001 13:13:06.604378 4913 scope.go:117] "RemoveContainer" containerID="aec210feb18a07e0b7f6a803740ba7a7b79e94feb989ce02534c98188cf4b9a4" Oct 01 13:13:06 crc kubenswrapper[4913]: I1001 13:13:06.632816 4913 scope.go:117] "RemoveContainer" containerID="1a423254bc50bb196b848503d26c963388ac847bbaa676561d3c490e5ddc5797" Oct 01 13:13:06 crc kubenswrapper[4913]: I1001 13:13:06.673986 4913 scope.go:117] "RemoveContainer" containerID="211f75c1df34f329425c0137b956ac558ca4d1be71e75c8dd45e9e08679a880d" Oct 01 13:13:10 crc kubenswrapper[4913]: I1001 13:13:10.083645 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:13:10 crc kubenswrapper[4913]: I1001 13:13:10.084201 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.228514 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mhmr4"] Oct 01 13:13:29 crc kubenswrapper[4913]: E1001 13:13:29.229422 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="extract-content" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.229435 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="extract-content" Oct 01 13:13:29 crc kubenswrapper[4913]: E1001 13:13:29.229449 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="registry-server" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.229455 4913 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="registry-server" Oct 01 13:13:29 crc kubenswrapper[4913]: E1001 13:13:29.229471 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="extract-utilities" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.229478 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="extract-utilities" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.229647 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f26e04-2409-4607-ab98-ab311ea3cc50" containerName="registry-server" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.230824 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.232152 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-catalog-content\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.232244 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-utilities\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.232292 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjgsr\" (UniqueName: \"kubernetes.io/projected/91e105dd-fc0d-4e81-8e8e-1000bb959465-kube-api-access-sjgsr\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.244810 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhmr4"] Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.333783 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-catalog-content\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.334243 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-utilities\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.334315 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjgsr\" (UniqueName: \"kubernetes.io/projected/91e105dd-fc0d-4e81-8e8e-1000bb959465-kube-api-access-sjgsr\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.334584 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-catalog-content\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.334958 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-utilities\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.357945 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjgsr\" (UniqueName: \"kubernetes.io/projected/91e105dd-fc0d-4e81-8e8e-1000bb959465-kube-api-access-sjgsr\") pod \"redhat-operators-mhmr4\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:29 crc kubenswrapper[4913]: I1001 13:13:29.556913 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:30 crc kubenswrapper[4913]: I1001 13:13:30.036164 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhmr4"] Oct 01 13:13:30 crc kubenswrapper[4913]: I1001 13:13:30.742060 4913 generic.go:334] "Generic (PLEG): container finished" podID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerID="f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4" exitCode=0 Oct 01 13:13:30 crc kubenswrapper[4913]: I1001 13:13:30.742114 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerDied","Data":"f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4"} Oct 01 13:13:30 crc kubenswrapper[4913]: I1001 13:13:30.742413 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerStarted","Data":"04a551d19b81d9969fffa2abf479a64c29898989bde1b9d6b499806b1bd95f2b"} Oct 01 13:13:32 crc kubenswrapper[4913]: I1001 13:13:32.759199 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerStarted","Data":"7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63"} Oct 01 13:13:33 crc kubenswrapper[4913]: I1001 13:13:33.771593 4913 generic.go:334] "Generic (PLEG): container finished" podID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerID="7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63" exitCode=0 Oct 01 13:13:33 crc kubenswrapper[4913]: I1001 13:13:33.771671 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerDied","Data":"7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63"} Oct 01 13:13:34 crc kubenswrapper[4913]: I1001 13:13:34.784695 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerStarted","Data":"d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7"} Oct 01 13:13:34 crc 
kubenswrapper[4913]: I1001 13:13:34.813936 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mhmr4" podStartSLOduration=2.017426286 podStartE2EDuration="5.81391578s" podCreationTimestamp="2025-10-01 13:13:29 +0000 UTC" firstStartedPulling="2025-10-01 13:13:30.744763146 +0000 UTC m=+2142.648238724" lastFinishedPulling="2025-10-01 13:13:34.5412526 +0000 UTC m=+2146.444728218" observedRunningTime="2025-10-01 13:13:34.809528477 +0000 UTC m=+2146.713004065" watchObservedRunningTime="2025-10-01 13:13:34.81391578 +0000 UTC m=+2146.717391358" Oct 01 13:13:39 crc kubenswrapper[4913]: I1001 13:13:39.557919 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:39 crc kubenswrapper[4913]: I1001 13:13:39.558593 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:40 crc kubenswrapper[4913]: I1001 13:13:40.083829 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:13:40 crc kubenswrapper[4913]: I1001 13:13:40.084107 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:13:40 crc kubenswrapper[4913]: I1001 13:13:40.635747 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mhmr4" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="registry-server" probeResult="failure" output=< Oct 01 13:13:40 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s Oct 01 13:13:40 crc kubenswrapper[4913]: > Oct 01 13:13:49 crc kubenswrapper[4913]: I1001 13:13:49.615099 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:49 crc kubenswrapper[4913]: I1001 13:13:49.679583 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:49 crc kubenswrapper[4913]: I1001 13:13:49.852118 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhmr4"] Oct 01 13:13:50 crc kubenswrapper[4913]: I1001 13:13:50.947127 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mhmr4" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="registry-server" containerID="cri-o://d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7" gracePeriod=2 Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.461012 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.601562 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-catalog-content\") pod \"91e105dd-fc0d-4e81-8e8e-1000bb959465\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.601662 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-utilities\") pod \"91e105dd-fc0d-4e81-8e8e-1000bb959465\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.601717 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjgsr\" (UniqueName: \"kubernetes.io/projected/91e105dd-fc0d-4e81-8e8e-1000bb959465-kube-api-access-sjgsr\") pod \"91e105dd-fc0d-4e81-8e8e-1000bb959465\" (UID: \"91e105dd-fc0d-4e81-8e8e-1000bb959465\") " Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.602848 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-utilities" (OuterVolumeSpecName: "utilities") pod "91e105dd-fc0d-4e81-8e8e-1000bb959465" (UID: "91e105dd-fc0d-4e81-8e8e-1000bb959465"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.608889 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e105dd-fc0d-4e81-8e8e-1000bb959465-kube-api-access-sjgsr" (OuterVolumeSpecName: "kube-api-access-sjgsr") pod "91e105dd-fc0d-4e81-8e8e-1000bb959465" (UID: "91e105dd-fc0d-4e81-8e8e-1000bb959465"). InnerVolumeSpecName "kube-api-access-sjgsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.703610 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjgsr\" (UniqueName: \"kubernetes.io/projected/91e105dd-fc0d-4e81-8e8e-1000bb959465-kube-api-access-sjgsr\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.703883 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.717153 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91e105dd-fc0d-4e81-8e8e-1000bb959465" (UID: "91e105dd-fc0d-4e81-8e8e-1000bb959465"). InnerVolumeSpecName "catalog-content". 
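The startup-probe failure a few entries earlier ("timeout: failed to connect service \":50051\" within 1s") is what the registry-server probe command prints while the catalog pod's gRPC endpoint on :50051 is not yet accepting connections; once the port opens, the probe flips to "started" and readiness follows. A rough Go equivalent of that connect-with-deadline check, using a plain TCP dial for illustration (the real probe speaks the gRPC health protocol):

package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	const addr = "127.0.0.1:50051"
	const timeout = 1 * time.Second

	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		// The situation the kubelet logs as probeResult="failure":
		// the registry-server has not opened its port yet.
		fmt.Printf("timeout: failed to connect service %q within %s\n", addr, timeout)
		os.Exit(1)
	}
	conn.Close()
	fmt.Println("service is serving on", addr)
}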
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.805831 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e105dd-fc0d-4e81-8e8e-1000bb959465-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.960980 4913 generic.go:334] "Generic (PLEG): container finished" podID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerID="d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7" exitCode=0 Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.961056 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerDied","Data":"d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7"} Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.961143 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhmr4" event={"ID":"91e105dd-fc0d-4e81-8e8e-1000bb959465","Type":"ContainerDied","Data":"04a551d19b81d9969fffa2abf479a64c29898989bde1b9d6b499806b1bd95f2b"} Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.961190 4913 scope.go:117] "RemoveContainer" containerID="d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.962406 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhmr4" Oct 01 13:13:51 crc kubenswrapper[4913]: I1001 13:13:51.999495 4913 scope.go:117] "RemoveContainer" containerID="7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.025155 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhmr4"] Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.031291 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mhmr4"] Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.033604 4913 scope.go:117] "RemoveContainer" containerID="f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.082903 4913 scope.go:117] "RemoveContainer" containerID="d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7" Oct 01 13:13:52 crc kubenswrapper[4913]: E1001 13:13:52.083551 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7\": container with ID starting with d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7 not found: ID does not exist" containerID="d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.083612 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7"} err="failed to get container status \"d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7\": rpc error: code = NotFound desc = could not find container \"d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7\": container with ID starting with d97a4b1640da9af3c145869592f07da07053617275217a336b83cae9cfd7d4f7 not found: ID does not exist" Oct 01 13:13:52 crc 
kubenswrapper[4913]: I1001 13:13:52.083646 4913 scope.go:117] "RemoveContainer" containerID="7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63" Oct 01 13:13:52 crc kubenswrapper[4913]: E1001 13:13:52.084739 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63\": container with ID starting with 7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63 not found: ID does not exist" containerID="7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.084821 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63"} err="failed to get container status \"7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63\": rpc error: code = NotFound desc = could not find container \"7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63\": container with ID starting with 7b51a8eb9e3fb72b3abdea9426a2e3343ace5bbbd3ba9a49814ef8eefe27eb63 not found: ID does not exist" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.084905 4913 scope.go:117] "RemoveContainer" containerID="f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4" Oct 01 13:13:52 crc kubenswrapper[4913]: E1001 13:13:52.085389 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4\": container with ID starting with f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4 not found: ID does not exist" containerID="f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.085419 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4"} err="failed to get container status \"f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4\": rpc error: code = NotFound desc = could not find container \"f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4\": container with ID starting with f244f32bbee0d9ddf7d4d2093d68516517d6aea41bd79e50effb25614ea98fc4 not found: ID does not exist" Oct 01 13:13:52 crc kubenswrapper[4913]: I1001 13:13:52.825908 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" path="/var/lib/kubelet/pods/91e105dd-fc0d-4e81-8e8e-1000bb959465/volumes" Oct 01 13:13:59 crc kubenswrapper[4913]: I1001 13:13:59.030392 4913 generic.go:334] "Generic (PLEG): container finished" podID="615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" containerID="2c1e5f8a8419b505d557b558a0706bbcc7c29f7da6bbb51204c169dace1f180d" exitCode=0 Oct 01 13:13:59 crc kubenswrapper[4913]: I1001 13:13:59.030455 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" event={"ID":"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02","Type":"ContainerDied","Data":"2c1e5f8a8419b505d557b558a0706bbcc7c29f7da6bbb51204c169dace1f180d"} Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.447154 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.487657 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-inventory\") pod \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.487702 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ssh-key\") pod \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.487802 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8b8n\" (UniqueName: \"kubernetes.io/projected/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-kube-api-access-z8b8n\") pod \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.487889 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-bootstrap-combined-ca-bundle\") pod \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.488785 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ceph\") pod \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\" (UID: \"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02\") " Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.496076 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-kube-api-access-z8b8n" (OuterVolumeSpecName: "kube-api-access-z8b8n") pod "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" (UID: "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02"). InnerVolumeSpecName "kube-api-access-z8b8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.511979 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" (UID: "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.514244 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ceph" (OuterVolumeSpecName: "ceph") pod "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" (UID: "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.539583 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-inventory" (OuterVolumeSpecName: "inventory") pod "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" (UID: "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.546046 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" (UID: "615e9e8a-5d4a-410f-9fa9-5e1acfd7df02"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.591460 4913 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.591497 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.591511 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.591525 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:00 crc kubenswrapper[4913]: I1001 13:14:00.591538 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8b8n\" (UniqueName: \"kubernetes.io/projected/615e9e8a-5d4a-410f-9fa9-5e1acfd7df02-kube-api-access-z8b8n\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.053746 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" event={"ID":"615e9e8a-5d4a-410f-9fa9-5e1acfd7df02","Type":"ContainerDied","Data":"4ffcfda2ddd32f22424dad6c1a0e2e6a687560b4fcb32682a603094b1fa3c568"} Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.054037 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ffcfda2ddd32f22424dad6c1a0e2e6a687560b4fcb32682a603094b1fa3c568" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.054099 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157050 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6"] Oct 01 13:14:01 crc kubenswrapper[4913]: E1001 13:14:01.157518 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="extract-utilities" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157539 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="extract-utilities" Oct 01 13:14:01 crc kubenswrapper[4913]: E1001 13:14:01.157562 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157571 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:01 crc kubenswrapper[4913]: E1001 13:14:01.157592 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="extract-content" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157600 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="extract-content" Oct 01 13:14:01 crc kubenswrapper[4913]: E1001 13:14:01.157627 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="registry-server" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157634 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="registry-server" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157841 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e9e8a-5d4a-410f-9fa9-5e1acfd7df02" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.157862 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e105dd-fc0d-4e81-8e8e-1000bb959465" containerName="registry-server" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.158641 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.162912 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.163033 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.163296 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.163325 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.163554 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.165918 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6"] Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.304387 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.304460 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.304499 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59s8x\" (UniqueName: \"kubernetes.io/projected/e9bedba1-5763-41ea-adc6-f0549d30df4d-kube-api-access-59s8x\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.304543 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.405919 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.405996 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-59s8x\" (UniqueName: \"kubernetes.io/projected/e9bedba1-5763-41ea-adc6-f0549d30df4d-kube-api-access-59s8x\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.406042 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.406123 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.410667 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.414364 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.416107 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.425931 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59s8x\" (UniqueName: \"kubernetes.io/projected/e9bedba1-5763-41ea-adc6-f0549d30df4d-kube-api-access-59s8x\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:01 crc kubenswrapper[4913]: I1001 13:14:01.478415 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:02 crc kubenswrapper[4913]: I1001 13:14:02.002620 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6"] Oct 01 13:14:02 crc kubenswrapper[4913]: W1001 13:14:02.006725 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9bedba1_5763_41ea_adc6_f0549d30df4d.slice/crio-d22442058d2f92d12e32f408ecd61d50c5e81dd2c1e189655b36bfe0a0c9d641 WatchSource:0}: Error finding container d22442058d2f92d12e32f408ecd61d50c5e81dd2c1e189655b36bfe0a0c9d641: Status 404 returned error can't find the container with id d22442058d2f92d12e32f408ecd61d50c5e81dd2c1e189655b36bfe0a0c9d641 Oct 01 13:14:02 crc kubenswrapper[4913]: I1001 13:14:02.069637 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" event={"ID":"e9bedba1-5763-41ea-adc6-f0549d30df4d","Type":"ContainerStarted","Data":"d22442058d2f92d12e32f408ecd61d50c5e81dd2c1e189655b36bfe0a0c9d641"} Oct 01 13:14:03 crc kubenswrapper[4913]: I1001 13:14:03.078553 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" event={"ID":"e9bedba1-5763-41ea-adc6-f0549d30df4d","Type":"ContainerStarted","Data":"d66b8d7c218e1496fe326a7134dbb86e3f62c395086ac4463bc73e4bc52d8d05"} Oct 01 13:14:03 crc kubenswrapper[4913]: I1001 13:14:03.095589 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" podStartSLOduration=1.619548639 podStartE2EDuration="2.095574765s" podCreationTimestamp="2025-10-01 13:14:01 +0000 UTC" firstStartedPulling="2025-10-01 13:14:02.008868276 +0000 UTC m=+2173.912343854" lastFinishedPulling="2025-10-01 13:14:02.484894402 +0000 UTC m=+2174.388369980" observedRunningTime="2025-10-01 13:14:03.094645671 +0000 UTC m=+2174.998121279" watchObservedRunningTime="2025-10-01 13:14:03.095574765 +0000 UTC m=+2174.999050343" Oct 01 13:14:10 crc kubenswrapper[4913]: I1001 13:14:10.084437 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:14:10 crc kubenswrapper[4913]: I1001 13:14:10.085214 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:14:10 crc kubenswrapper[4913]: I1001 13:14:10.085305 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:14:10 crc kubenswrapper[4913]: I1001 13:14:10.086768 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Oct 01 13:14:10 crc kubenswrapper[4913]: I1001 13:14:10.086874 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" gracePeriod=600 Oct 01 13:14:10 crc kubenswrapper[4913]: E1001 13:14:10.225059 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:14:11 crc kubenswrapper[4913]: I1001 13:14:11.198705 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" exitCode=0 Oct 01 13:14:11 crc kubenswrapper[4913]: I1001 13:14:11.198807 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5"} Oct 01 13:14:11 crc kubenswrapper[4913]: I1001 13:14:11.199045 4913 scope.go:117] "RemoveContainer" containerID="374f1e21af7aa5a14f018e46d1950c6956f6712b2205ecee89544e8f1a470d3e" Oct 01 13:14:11 crc kubenswrapper[4913]: I1001 13:14:11.200222 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:14:11 crc kubenswrapper[4913]: E1001 13:14:11.200878 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:14:22 crc kubenswrapper[4913]: I1001 13:14:22.806752 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:14:22 crc kubenswrapper[4913]: E1001 13:14:22.807683 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:14:28 crc kubenswrapper[4913]: I1001 13:14:28.396757 4913 generic.go:334] "Generic (PLEG): container finished" podID="e9bedba1-5763-41ea-adc6-f0549d30df4d" containerID="d66b8d7c218e1496fe326a7134dbb86e3f62c395086ac4463bc73e4bc52d8d05" exitCode=0 Oct 01 13:14:28 crc kubenswrapper[4913]: I1001 13:14:28.396881 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" 
event={"ID":"e9bedba1-5763-41ea-adc6-f0549d30df4d","Type":"ContainerDied","Data":"d66b8d7c218e1496fe326a7134dbb86e3f62c395086ac4463bc73e4bc52d8d05"} Oct 01 13:14:29 crc kubenswrapper[4913]: I1001 13:14:29.899961 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.065404 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ceph\") pod \"e9bedba1-5763-41ea-adc6-f0549d30df4d\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.065810 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59s8x\" (UniqueName: \"kubernetes.io/projected/e9bedba1-5763-41ea-adc6-f0549d30df4d-kube-api-access-59s8x\") pod \"e9bedba1-5763-41ea-adc6-f0549d30df4d\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.065858 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ssh-key\") pod \"e9bedba1-5763-41ea-adc6-f0549d30df4d\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.066042 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-inventory\") pod \"e9bedba1-5763-41ea-adc6-f0549d30df4d\" (UID: \"e9bedba1-5763-41ea-adc6-f0549d30df4d\") " Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.070609 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bedba1-5763-41ea-adc6-f0549d30df4d-kube-api-access-59s8x" (OuterVolumeSpecName: "kube-api-access-59s8x") pod "e9bedba1-5763-41ea-adc6-f0549d30df4d" (UID: "e9bedba1-5763-41ea-adc6-f0549d30df4d"). InnerVolumeSpecName "kube-api-access-59s8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.071800 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ceph" (OuterVolumeSpecName: "ceph") pod "e9bedba1-5763-41ea-adc6-f0549d30df4d" (UID: "e9bedba1-5763-41ea-adc6-f0549d30df4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.090767 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9bedba1-5763-41ea-adc6-f0549d30df4d" (UID: "e9bedba1-5763-41ea-adc6-f0549d30df4d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.091523 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-inventory" (OuterVolumeSpecName: "inventory") pod "e9bedba1-5763-41ea-adc6-f0549d30df4d" (UID: "e9bedba1-5763-41ea-adc6-f0549d30df4d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.168120 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.168159 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59s8x\" (UniqueName: \"kubernetes.io/projected/e9bedba1-5763-41ea-adc6-f0549d30df4d-kube-api-access-59s8x\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.168173 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.168184 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9bedba1-5763-41ea-adc6-f0549d30df4d-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.419292 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" event={"ID":"e9bedba1-5763-41ea-adc6-f0549d30df4d","Type":"ContainerDied","Data":"d22442058d2f92d12e32f408ecd61d50c5e81dd2c1e189655b36bfe0a0c9d641"} Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.420795 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22442058d2f92d12e32f408ecd61d50c5e81dd2c1e189655b36bfe0a0c9d641" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.420604 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.536081 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27"] Oct 01 13:14:30 crc kubenswrapper[4913]: E1001 13:14:30.536799 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bedba1-5763-41ea-adc6-f0549d30df4d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.536899 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bedba1-5763-41ea-adc6-f0549d30df4d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.537239 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bedba1-5763-41ea-adc6-f0549d30df4d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.538031 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.540746 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.541024 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.541168 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.542377 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.542757 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.549823 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27"] Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.576350 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.576476 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txlc\" (UniqueName: \"kubernetes.io/projected/01e03618-1c33-4ee7-8b54-c07fef3946e2-kube-api-access-9txlc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.576565 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.576702 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.677809 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.677943 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.677986 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txlc\" (UniqueName: \"kubernetes.io/projected/01e03618-1c33-4ee7-8b54-c07fef3946e2-kube-api-access-9txlc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.678027 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.683897 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.684494 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.685424 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.698929 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txlc\" (UniqueName: \"kubernetes.io/projected/01e03618-1c33-4ee7-8b54-c07fef3946e2-kube-api-access-9txlc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q4g27\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:30 crc kubenswrapper[4913]: I1001 13:14:30.881709 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:31 crc kubenswrapper[4913]: I1001 13:14:31.392838 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27"] Oct 01 13:14:31 crc kubenswrapper[4913]: I1001 13:14:31.429003 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" event={"ID":"01e03618-1c33-4ee7-8b54-c07fef3946e2","Type":"ContainerStarted","Data":"07869260cf71a12726ec3c1f8995733570ba83ff1e8ccf25f22acf6401c1ebb4"} Oct 01 13:14:32 crc kubenswrapper[4913]: I1001 13:14:32.441318 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" event={"ID":"01e03618-1c33-4ee7-8b54-c07fef3946e2","Type":"ContainerStarted","Data":"c29e2dd4e9541008905082415f1f1d71ce098c892bd16b1a0c6527c7020d639d"} Oct 01 13:14:32 crc kubenswrapper[4913]: I1001 13:14:32.459045 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" podStartSLOduration=1.93481293 podStartE2EDuration="2.459026761s" podCreationTimestamp="2025-10-01 13:14:30 +0000 UTC" firstStartedPulling="2025-10-01 13:14:31.403095053 +0000 UTC m=+2203.306570631" lastFinishedPulling="2025-10-01 13:14:31.927308844 +0000 UTC m=+2203.830784462" observedRunningTime="2025-10-01 13:14:32.458924408 +0000 UTC m=+2204.362400026" watchObservedRunningTime="2025-10-01 13:14:32.459026761 +0000 UTC m=+2204.362502339" Oct 01 13:14:36 crc kubenswrapper[4913]: I1001 13:14:36.807562 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:14:36 crc kubenswrapper[4913]: E1001 13:14:36.808853 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:14:37 crc kubenswrapper[4913]: I1001 13:14:37.481444 4913 generic.go:334] "Generic (PLEG): container finished" podID="01e03618-1c33-4ee7-8b54-c07fef3946e2" containerID="c29e2dd4e9541008905082415f1f1d71ce098c892bd16b1a0c6527c7020d639d" exitCode=0 Oct 01 13:14:37 crc kubenswrapper[4913]: I1001 13:14:37.481570 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" event={"ID":"01e03618-1c33-4ee7-8b54-c07fef3946e2","Type":"ContainerDied","Data":"c29e2dd4e9541008905082415f1f1d71ce098c892bd16b1a0c6527c7020d639d"} Oct 01 13:14:38 crc kubenswrapper[4913]: I1001 13:14:38.942642 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.139084 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-inventory\") pod \"01e03618-1c33-4ee7-8b54-c07fef3946e2\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.139157 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txlc\" (UniqueName: \"kubernetes.io/projected/01e03618-1c33-4ee7-8b54-c07fef3946e2-kube-api-access-9txlc\") pod \"01e03618-1c33-4ee7-8b54-c07fef3946e2\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.139263 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ceph\") pod \"01e03618-1c33-4ee7-8b54-c07fef3946e2\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.139426 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ssh-key\") pod \"01e03618-1c33-4ee7-8b54-c07fef3946e2\" (UID: \"01e03618-1c33-4ee7-8b54-c07fef3946e2\") " Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.145115 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ceph" (OuterVolumeSpecName: "ceph") pod "01e03618-1c33-4ee7-8b54-c07fef3946e2" (UID: "01e03618-1c33-4ee7-8b54-c07fef3946e2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.148436 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e03618-1c33-4ee7-8b54-c07fef3946e2-kube-api-access-9txlc" (OuterVolumeSpecName: "kube-api-access-9txlc") pod "01e03618-1c33-4ee7-8b54-c07fef3946e2" (UID: "01e03618-1c33-4ee7-8b54-c07fef3946e2"). InnerVolumeSpecName "kube-api-access-9txlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.196139 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-inventory" (OuterVolumeSpecName: "inventory") pod "01e03618-1c33-4ee7-8b54-c07fef3946e2" (UID: "01e03618-1c33-4ee7-8b54-c07fef3946e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.207305 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "01e03618-1c33-4ee7-8b54-c07fef3946e2" (UID: "01e03618-1c33-4ee7-8b54-c07fef3946e2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.242352 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.242449 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.242513 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e03618-1c33-4ee7-8b54-c07fef3946e2-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.242536 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9txlc\" (UniqueName: \"kubernetes.io/projected/01e03618-1c33-4ee7-8b54-c07fef3946e2-kube-api-access-9txlc\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.502052 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" event={"ID":"01e03618-1c33-4ee7-8b54-c07fef3946e2","Type":"ContainerDied","Data":"07869260cf71a12726ec3c1f8995733570ba83ff1e8ccf25f22acf6401c1ebb4"} Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.502111 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07869260cf71a12726ec3c1f8995733570ba83ff1e8ccf25f22acf6401c1ebb4" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.502159 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q4g27" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.660461 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft"] Oct 01 13:14:39 crc kubenswrapper[4913]: E1001 13:14:39.661225 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e03618-1c33-4ee7-8b54-c07fef3946e2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.661247 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e03618-1c33-4ee7-8b54-c07fef3946e2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.661470 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e03618-1c33-4ee7-8b54-c07fef3946e2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.662239 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.664500 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.664686 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.664894 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.666279 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.666378 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.671916 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft"] Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.854723 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjcp\" (UniqueName: \"kubernetes.io/projected/daa71a8f-cb1b-4e0a-afaa-906bc0408723-kube-api-access-9qjcp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.854998 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.855423 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.855671 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.958913 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjcp\" (UniqueName: \"kubernetes.io/projected/daa71a8f-cb1b-4e0a-afaa-906bc0408723-kube-api-access-9qjcp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.959632 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.959846 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.959974 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.966242 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.966405 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.969522 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.982250 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjcp\" (UniqueName: \"kubernetes.io/projected/daa71a8f-cb1b-4e0a-afaa-906bc0408723-kube-api-access-9qjcp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m58ft\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:39 crc kubenswrapper[4913]: I1001 13:14:39.986726 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:14:40 crc kubenswrapper[4913]: I1001 13:14:40.509117 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft"] Oct 01 13:14:41 crc kubenswrapper[4913]: I1001 13:14:41.526414 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" event={"ID":"daa71a8f-cb1b-4e0a-afaa-906bc0408723","Type":"ContainerStarted","Data":"dfd762c60c13820e549b3dafebd0b35f7a441ebcc96f279ed7bb74d5a426841c"} Oct 01 13:14:41 crc kubenswrapper[4913]: I1001 13:14:41.526955 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" event={"ID":"daa71a8f-cb1b-4e0a-afaa-906bc0408723","Type":"ContainerStarted","Data":"fd51cb84892c87b7f6b40588990ac6a648905199d1fd23df9451f7092ef99b05"} Oct 01 13:14:41 crc kubenswrapper[4913]: I1001 13:14:41.558498 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" podStartSLOduration=2.146563473 podStartE2EDuration="2.55847819s" podCreationTimestamp="2025-10-01 13:14:39 +0000 UTC" firstStartedPulling="2025-10-01 13:14:40.512984384 +0000 UTC m=+2212.416459962" lastFinishedPulling="2025-10-01 13:14:40.924899101 +0000 UTC m=+2212.828374679" observedRunningTime="2025-10-01 13:14:41.54810982 +0000 UTC m=+2213.451585428" watchObservedRunningTime="2025-10-01 13:14:41.55847819 +0000 UTC m=+2213.461953768" Oct 01 13:14:48 crc kubenswrapper[4913]: I1001 13:14:48.812659 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:14:48 crc kubenswrapper[4913]: E1001 13:14:48.813723 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:14:59 crc kubenswrapper[4913]: I1001 13:14:59.807126 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:14:59 crc kubenswrapper[4913]: E1001 13:14:59.807993 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.150607 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj"] Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.152533 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.156410 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.156763 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.163607 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj"] Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.250145 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02fddc88-0d2f-4339-acc7-b2606d785b76-secret-volume\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.250218 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02fddc88-0d2f-4339-acc7-b2606d785b76-config-volume\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.250259 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxstk\" (UniqueName: \"kubernetes.io/projected/02fddc88-0d2f-4339-acc7-b2606d785b76-kube-api-access-cxstk\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.352128 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxstk\" (UniqueName: \"kubernetes.io/projected/02fddc88-0d2f-4339-acc7-b2606d785b76-kube-api-access-cxstk\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.352549 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02fddc88-0d2f-4339-acc7-b2606d785b76-secret-volume\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.352701 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02fddc88-0d2f-4339-acc7-b2606d785b76-config-volume\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.354083 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02fddc88-0d2f-4339-acc7-b2606d785b76-config-volume\") pod 
\"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.358484 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02fddc88-0d2f-4339-acc7-b2606d785b76-secret-volume\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.372035 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxstk\" (UniqueName: \"kubernetes.io/projected/02fddc88-0d2f-4339-acc7-b2606d785b76-kube-api-access-cxstk\") pod \"collect-profiles-29322075-c6nwj\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:00 crc kubenswrapper[4913]: I1001 13:15:00.491444 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:01 crc kubenswrapper[4913]: W1001 13:15:01.015225 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02fddc88_0d2f_4339_acc7_b2606d785b76.slice/crio-cf5fd5092be31d70fce0d46881ef2ccb92e468a1a4b67194712c3bbe678d4135 WatchSource:0}: Error finding container cf5fd5092be31d70fce0d46881ef2ccb92e468a1a4b67194712c3bbe678d4135: Status 404 returned error can't find the container with id cf5fd5092be31d70fce0d46881ef2ccb92e468a1a4b67194712c3bbe678d4135 Oct 01 13:15:01 crc kubenswrapper[4913]: I1001 13:15:01.016834 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj"] Oct 01 13:15:01 crc kubenswrapper[4913]: I1001 13:15:01.731041 4913 generic.go:334] "Generic (PLEG): container finished" podID="02fddc88-0d2f-4339-acc7-b2606d785b76" containerID="a6e30bc4da301abfb999bc669a79e68e7d6016a20833712e66828bd2b8b85923" exitCode=0 Oct 01 13:15:01 crc kubenswrapper[4913]: I1001 13:15:01.731094 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" event={"ID":"02fddc88-0d2f-4339-acc7-b2606d785b76","Type":"ContainerDied","Data":"a6e30bc4da301abfb999bc669a79e68e7d6016a20833712e66828bd2b8b85923"} Oct 01 13:15:01 crc kubenswrapper[4913]: I1001 13:15:01.731349 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" event={"ID":"02fddc88-0d2f-4339-acc7-b2606d785b76","Type":"ContainerStarted","Data":"cf5fd5092be31d70fce0d46881ef2ccb92e468a1a4b67194712c3bbe678d4135"} Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.108614 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.210695 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02fddc88-0d2f-4339-acc7-b2606d785b76-secret-volume\") pod \"02fddc88-0d2f-4339-acc7-b2606d785b76\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.210919 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxstk\" (UniqueName: \"kubernetes.io/projected/02fddc88-0d2f-4339-acc7-b2606d785b76-kube-api-access-cxstk\") pod \"02fddc88-0d2f-4339-acc7-b2606d785b76\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.211051 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02fddc88-0d2f-4339-acc7-b2606d785b76-config-volume\") pod \"02fddc88-0d2f-4339-acc7-b2606d785b76\" (UID: \"02fddc88-0d2f-4339-acc7-b2606d785b76\") " Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.211989 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02fddc88-0d2f-4339-acc7-b2606d785b76-config-volume" (OuterVolumeSpecName: "config-volume") pod "02fddc88-0d2f-4339-acc7-b2606d785b76" (UID: "02fddc88-0d2f-4339-acc7-b2606d785b76"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.216482 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02fddc88-0d2f-4339-acc7-b2606d785b76-kube-api-access-cxstk" (OuterVolumeSpecName: "kube-api-access-cxstk") pod "02fddc88-0d2f-4339-acc7-b2606d785b76" (UID: "02fddc88-0d2f-4339-acc7-b2606d785b76"). InnerVolumeSpecName "kube-api-access-cxstk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.217153 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fddc88-0d2f-4339-acc7-b2606d785b76-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "02fddc88-0d2f-4339-acc7-b2606d785b76" (UID: "02fddc88-0d2f-4339-acc7-b2606d785b76"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.313863 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxstk\" (UniqueName: \"kubernetes.io/projected/02fddc88-0d2f-4339-acc7-b2606d785b76-kube-api-access-cxstk\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.314331 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02fddc88-0d2f-4339-acc7-b2606d785b76-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.314354 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02fddc88-0d2f-4339-acc7-b2606d785b76-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.770717 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" event={"ID":"02fddc88-0d2f-4339-acc7-b2606d785b76","Type":"ContainerDied","Data":"cf5fd5092be31d70fce0d46881ef2ccb92e468a1a4b67194712c3bbe678d4135"} Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.770780 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5fd5092be31d70fce0d46881ef2ccb92e468a1a4b67194712c3bbe678d4135" Oct 01 13:15:03 crc kubenswrapper[4913]: I1001 13:15:03.770831 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj" Oct 01 13:15:04 crc kubenswrapper[4913]: I1001 13:15:04.202605 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc"] Oct 01 13:15:04 crc kubenswrapper[4913]: I1001 13:15:04.210279 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-jlrxc"] Oct 01 13:15:04 crc kubenswrapper[4913]: I1001 13:15:04.823057 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fd5584-6878-4be8-83bb-f61003df2639" path="/var/lib/kubelet/pods/e6fd5584-6878-4be8-83bb-f61003df2639/volumes" Oct 01 13:15:06 crc kubenswrapper[4913]: I1001 13:15:06.884384 4913 scope.go:117] "RemoveContainer" containerID="1137a411f44765a5a75fe26e2e6736875a8cc4f16724a24f4b6b5191f717df47" Oct 01 13:15:12 crc kubenswrapper[4913]: I1001 13:15:12.807617 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:15:12 crc kubenswrapper[4913]: E1001 13:15:12.808545 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:15:17 crc kubenswrapper[4913]: I1001 13:15:17.917150 4913 generic.go:334] "Generic (PLEG): container finished" podID="daa71a8f-cb1b-4e0a-afaa-906bc0408723" containerID="dfd762c60c13820e549b3dafebd0b35f7a441ebcc96f279ed7bb74d5a426841c" exitCode=0 Oct 01 13:15:17 crc kubenswrapper[4913]: I1001 13:15:17.917687 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" event={"ID":"daa71a8f-cb1b-4e0a-afaa-906bc0408723","Type":"ContainerDied","Data":"dfd762c60c13820e549b3dafebd0b35f7a441ebcc96f279ed7bb74d5a426841c"} Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.326026 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.436907 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ssh-key\") pod \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.437045 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qjcp\" (UniqueName: \"kubernetes.io/projected/daa71a8f-cb1b-4e0a-afaa-906bc0408723-kube-api-access-9qjcp\") pod \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.437076 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-inventory\") pod \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.437136 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ceph\") pod \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\" (UID: \"daa71a8f-cb1b-4e0a-afaa-906bc0408723\") " Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.442929 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa71a8f-cb1b-4e0a-afaa-906bc0408723-kube-api-access-9qjcp" (OuterVolumeSpecName: "kube-api-access-9qjcp") pod "daa71a8f-cb1b-4e0a-afaa-906bc0408723" (UID: "daa71a8f-cb1b-4e0a-afaa-906bc0408723"). InnerVolumeSpecName "kube-api-access-9qjcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.442992 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ceph" (OuterVolumeSpecName: "ceph") pod "daa71a8f-cb1b-4e0a-afaa-906bc0408723" (UID: "daa71a8f-cb1b-4e0a-afaa-906bc0408723"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.467045 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-inventory" (OuterVolumeSpecName: "inventory") pod "daa71a8f-cb1b-4e0a-afaa-906bc0408723" (UID: "daa71a8f-cb1b-4e0a-afaa-906bc0408723"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.471198 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "daa71a8f-cb1b-4e0a-afaa-906bc0408723" (UID: "daa71a8f-cb1b-4e0a-afaa-906bc0408723"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.539445 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qjcp\" (UniqueName: \"kubernetes.io/projected/daa71a8f-cb1b-4e0a-afaa-906bc0408723-kube-api-access-9qjcp\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.539489 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.539508 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.539526 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daa71a8f-cb1b-4e0a-afaa-906bc0408723-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.936308 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" event={"ID":"daa71a8f-cb1b-4e0a-afaa-906bc0408723","Type":"ContainerDied","Data":"fd51cb84892c87b7f6b40588990ac6a648905199d1fd23df9451f7092ef99b05"} Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.936773 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd51cb84892c87b7f6b40588990ac6a648905199d1fd23df9451f7092ef99b05" Oct 01 13:15:19 crc kubenswrapper[4913]: I1001 13:15:19.936370 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m58ft" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.055149 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn"] Oct 01 13:15:20 crc kubenswrapper[4913]: E1001 13:15:20.055807 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02fddc88-0d2f-4339-acc7-b2606d785b76" containerName="collect-profiles" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.055833 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fddc88-0d2f-4339-acc7-b2606d785b76" containerName="collect-profiles" Oct 01 13:15:20 crc kubenswrapper[4913]: E1001 13:15:20.055860 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa71a8f-cb1b-4e0a-afaa-906bc0408723" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.055871 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa71a8f-cb1b-4e0a-afaa-906bc0408723" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.056225 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="02fddc88-0d2f-4339-acc7-b2606d785b76" containerName="collect-profiles" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.056247 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa71a8f-cb1b-4e0a-afaa-906bc0408723" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.057065 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.058969 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.059012 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.058970 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.059339 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.059730 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.063683 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn"] Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.152587 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2868d\" (UniqueName: \"kubernetes.io/projected/58cec94b-852c-4959-a24b-04d7f83fc246-kube-api-access-2868d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.152632 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.152714 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.152944 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.256319 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.256704 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.258713 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2868d\" (UniqueName: \"kubernetes.io/projected/58cec94b-852c-4959-a24b-04d7f83fc246-kube-api-access-2868d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.258824 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.262030 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.264158 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.269889 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.285447 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2868d\" (UniqueName: \"kubernetes.io/projected/58cec94b-852c-4959-a24b-04d7f83fc246-kube-api-access-2868d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.389751 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.870062 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn"] Oct 01 13:15:20 crc kubenswrapper[4913]: I1001 13:15:20.944123 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"58cec94b-852c-4959-a24b-04d7f83fc246","Type":"ContainerStarted","Data":"c8af174849b6692bc79fad67883366469523fc32eeea79df117b11e4fd35f55b"} Oct 01 13:15:22 crc kubenswrapper[4913]: I1001 13:15:22.969554 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"58cec94b-852c-4959-a24b-04d7f83fc246","Type":"ContainerStarted","Data":"57a98d493236fbd6b5174f2c0bb21be49586331f6724124d5af29901aab14d19"} Oct 01 13:15:23 crc kubenswrapper[4913]: I1001 13:15:23.001557 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" podStartSLOduration=2.157959006 podStartE2EDuration="3.001532453s" podCreationTimestamp="2025-10-01 13:15:20 +0000 UTC" firstStartedPulling="2025-10-01 13:15:20.87744458 +0000 UTC m=+2252.780920158" lastFinishedPulling="2025-10-01 13:15:21.721017987 +0000 UTC m=+2253.624493605" observedRunningTime="2025-10-01 13:15:22.99105723 +0000 UTC m=+2254.894532878" watchObservedRunningTime="2025-10-01 13:15:23.001532453 +0000 UTC m=+2254.905008071" Oct 01 13:15:23 crc kubenswrapper[4913]: I1001 13:15:23.807618 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:15:23 crc kubenswrapper[4913]: E1001 13:15:23.807904 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:15:26 crc kubenswrapper[4913]: I1001 13:15:26.004507 4913 generic.go:334] "Generic (PLEG): container finished" podID="58cec94b-852c-4959-a24b-04d7f83fc246" containerID="57a98d493236fbd6b5174f2c0bb21be49586331f6724124d5af29901aab14d19" exitCode=0 Oct 01 13:15:26 crc kubenswrapper[4913]: I1001 13:15:26.004872 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"58cec94b-852c-4959-a24b-04d7f83fc246","Type":"ContainerDied","Data":"57a98d493236fbd6b5174f2c0bb21be49586331f6724124d5af29901aab14d19"} Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.449024 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.547014 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ssh-key\") pod \"58cec94b-852c-4959-a24b-04d7f83fc246\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.547058 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-inventory\") pod \"58cec94b-852c-4959-a24b-04d7f83fc246\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.547097 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2868d\" (UniqueName: \"kubernetes.io/projected/58cec94b-852c-4959-a24b-04d7f83fc246-kube-api-access-2868d\") pod \"58cec94b-852c-4959-a24b-04d7f83fc246\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.547214 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ceph\") pod \"58cec94b-852c-4959-a24b-04d7f83fc246\" (UID: \"58cec94b-852c-4959-a24b-04d7f83fc246\") " Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.552423 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cec94b-852c-4959-a24b-04d7f83fc246-kube-api-access-2868d" (OuterVolumeSpecName: "kube-api-access-2868d") pod "58cec94b-852c-4959-a24b-04d7f83fc246" (UID: "58cec94b-852c-4959-a24b-04d7f83fc246"). InnerVolumeSpecName "kube-api-access-2868d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.556494 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ceph" (OuterVolumeSpecName: "ceph") pod "58cec94b-852c-4959-a24b-04d7f83fc246" (UID: "58cec94b-852c-4959-a24b-04d7f83fc246"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.584646 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-inventory" (OuterVolumeSpecName: "inventory") pod "58cec94b-852c-4959-a24b-04d7f83fc246" (UID: "58cec94b-852c-4959-a24b-04d7f83fc246"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.610576 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "58cec94b-852c-4959-a24b-04d7f83fc246" (UID: "58cec94b-852c-4959-a24b-04d7f83fc246"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.649786 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.649827 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.649843 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cec94b-852c-4959-a24b-04d7f83fc246-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:27 crc kubenswrapper[4913]: I1001 13:15:27.649856 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2868d\" (UniqueName: \"kubernetes.io/projected/58cec94b-852c-4959-a24b-04d7f83fc246-kube-api-access-2868d\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.025696 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"58cec94b-852c-4959-a24b-04d7f83fc246","Type":"ContainerDied","Data":"c8af174849b6692bc79fad67883366469523fc32eeea79df117b11e4fd35f55b"} Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.026036 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8af174849b6692bc79fad67883366469523fc32eeea79df117b11e4fd35f55b" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.025755 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.127143 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz"] Oct 01 13:15:28 crc kubenswrapper[4913]: E1001 13:15:28.127546 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cec94b-852c-4959-a24b-04d7f83fc246" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.127559 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cec94b-852c-4959-a24b-04d7f83fc246" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.127726 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cec94b-852c-4959-a24b-04d7f83fc246" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.128330 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.131138 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.132278 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.132705 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.133079 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.135801 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.141635 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz"] Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.261656 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.262106 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.262342 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84rp2\" (UniqueName: \"kubernetes.io/projected/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-kube-api-access-84rp2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.262505 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.364993 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.365188 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-84rp2\" (UniqueName: \"kubernetes.io/projected/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-kube-api-access-84rp2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.365701 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.367950 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.369951 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.370094 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.371696 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.383253 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84rp2\" (UniqueName: \"kubernetes.io/projected/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-kube-api-access-84rp2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:28 crc kubenswrapper[4913]: I1001 13:15:28.457122 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:15:29 crc kubenswrapper[4913]: I1001 13:15:29.001899 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz"] Oct 01 13:15:29 crc kubenswrapper[4913]: W1001 13:15:29.006643 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ce8f21_34e2_4e8e_ace0_59c635454fcc.slice/crio-c4ec4301d1259ec8722f28a170cd2adc3c0b50c38df281cd8e1936b480f6934b WatchSource:0}: Error finding container c4ec4301d1259ec8722f28a170cd2adc3c0b50c38df281cd8e1936b480f6934b: Status 404 returned error can't find the container with id c4ec4301d1259ec8722f28a170cd2adc3c0b50c38df281cd8e1936b480f6934b Oct 01 13:15:29 crc kubenswrapper[4913]: I1001 13:15:29.037472 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" event={"ID":"d5ce8f21-34e2-4e8e-ace0-59c635454fcc","Type":"ContainerStarted","Data":"c4ec4301d1259ec8722f28a170cd2adc3c0b50c38df281cd8e1936b480f6934b"} Oct 01 13:15:30 crc kubenswrapper[4913]: I1001 13:15:30.052036 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" event={"ID":"d5ce8f21-34e2-4e8e-ace0-59c635454fcc","Type":"ContainerStarted","Data":"ab425929601e24701369bed8befe72ef5fc3d9c1bd3d87a68c5bcf05cc610fcc"} Oct 01 13:15:30 crc kubenswrapper[4913]: I1001 13:15:30.084181 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" podStartSLOduration=1.641931595 podStartE2EDuration="2.084042274s" podCreationTimestamp="2025-10-01 13:15:28 +0000 UTC" firstStartedPulling="2025-10-01 13:15:29.010156801 +0000 UTC m=+2260.913632379" lastFinishedPulling="2025-10-01 13:15:29.45226746 +0000 UTC m=+2261.355743058" observedRunningTime="2025-10-01 13:15:30.066555853 +0000 UTC m=+2261.970031441" watchObservedRunningTime="2025-10-01 13:15:30.084042274 +0000 UTC m=+2261.987517892" Oct 01 13:15:37 crc kubenswrapper[4913]: I1001 13:15:37.806495 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:15:37 crc kubenswrapper[4913]: E1001 13:15:37.807388 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:15:48 crc kubenswrapper[4913]: I1001 13:15:48.815881 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:15:48 crc kubenswrapper[4913]: E1001 13:15:48.817049 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:16:01 crc kubenswrapper[4913]: I1001 
13:16:01.806741 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:16:01 crc kubenswrapper[4913]: E1001 13:16:01.807593 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:16:12 crc kubenswrapper[4913]: I1001 13:16:12.806883 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:16:12 crc kubenswrapper[4913]: E1001 13:16:12.809200 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:16:16 crc kubenswrapper[4913]: I1001 13:16:16.468470 4913 generic.go:334] "Generic (PLEG): container finished" podID="d5ce8f21-34e2-4e8e-ace0-59c635454fcc" containerID="ab425929601e24701369bed8befe72ef5fc3d9c1bd3d87a68c5bcf05cc610fcc" exitCode=0 Oct 01 13:16:16 crc kubenswrapper[4913]: I1001 13:16:16.468617 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" event={"ID":"d5ce8f21-34e2-4e8e-ace0-59c635454fcc","Type":"ContainerDied","Data":"ab425929601e24701369bed8befe72ef5fc3d9c1bd3d87a68c5bcf05cc610fcc"} Oct 01 13:16:17 crc kubenswrapper[4913]: I1001 13:16:17.910926 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.028750 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ssh-key\") pod \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.028875 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ceph\") pod \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.028963 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-inventory\") pod \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.029060 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84rp2\" (UniqueName: \"kubernetes.io/projected/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-kube-api-access-84rp2\") pod \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\" (UID: \"d5ce8f21-34e2-4e8e-ace0-59c635454fcc\") " Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.034975 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-kube-api-access-84rp2" (OuterVolumeSpecName: "kube-api-access-84rp2") pod "d5ce8f21-34e2-4e8e-ace0-59c635454fcc" (UID: "d5ce8f21-34e2-4e8e-ace0-59c635454fcc"). InnerVolumeSpecName "kube-api-access-84rp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.035116 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ceph" (OuterVolumeSpecName: "ceph") pod "d5ce8f21-34e2-4e8e-ace0-59c635454fcc" (UID: "d5ce8f21-34e2-4e8e-ace0-59c635454fcc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.057090 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-inventory" (OuterVolumeSpecName: "inventory") pod "d5ce8f21-34e2-4e8e-ace0-59c635454fcc" (UID: "d5ce8f21-34e2-4e8e-ace0-59c635454fcc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.057670 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5ce8f21-34e2-4e8e-ace0-59c635454fcc" (UID: "d5ce8f21-34e2-4e8e-ace0-59c635454fcc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.131605 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84rp2\" (UniqueName: \"kubernetes.io/projected/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-kube-api-access-84rp2\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.131638 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.131646 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.131656 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ce8f21-34e2-4e8e-ace0-59c635454fcc-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.487814 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" event={"ID":"d5ce8f21-34e2-4e8e-ace0-59c635454fcc","Type":"ContainerDied","Data":"c4ec4301d1259ec8722f28a170cd2adc3c0b50c38df281cd8e1936b480f6934b"} Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.487853 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4ec4301d1259ec8722f28a170cd2adc3c0b50c38df281cd8e1936b480f6934b" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.487911 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.565539 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2j2qp"] Oct 01 13:16:18 crc kubenswrapper[4913]: E1001 13:16:18.566065 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ce8f21-34e2-4e8e-ace0-59c635454fcc" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.566087 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ce8f21-34e2-4e8e-ace0-59c635454fcc" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.566408 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ce8f21-34e2-4e8e-ace0-59c635454fcc" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.567214 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.570467 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.570518 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.570550 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.570645 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.571318 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.574293 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2j2qp"] Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.743374 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.743416 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ceph\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.743453 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.743634 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sd7c\" (UniqueName: \"kubernetes.io/projected/2218b097-a775-4dca-88c6-d7676fc4ef97-kube-api-access-2sd7c\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.845767 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.845809 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ceph\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" 
(UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.845842 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.845885 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sd7c\" (UniqueName: \"kubernetes.io/projected/2218b097-a775-4dca-88c6-d7676fc4ef97-kube-api-access-2sd7c\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.850380 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.850378 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ceph\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.851454 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.863679 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sd7c\" (UniqueName: \"kubernetes.io/projected/2218b097-a775-4dca-88c6-d7676fc4ef97-kube-api-access-2sd7c\") pod \"ssh-known-hosts-edpm-deployment-2j2qp\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:18 crc kubenswrapper[4913]: I1001 13:16:18.884050 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:19 crc kubenswrapper[4913]: I1001 13:16:19.421386 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2j2qp"] Oct 01 13:16:19 crc kubenswrapper[4913]: I1001 13:16:19.498746 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" event={"ID":"2218b097-a775-4dca-88c6-d7676fc4ef97","Type":"ContainerStarted","Data":"1804fbe69e6b128bff852f90297a0893c6e0b1a0731765215ad7ee45c3d10b73"} Oct 01 13:16:20 crc kubenswrapper[4913]: I1001 13:16:20.509912 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" event={"ID":"2218b097-a775-4dca-88c6-d7676fc4ef97","Type":"ContainerStarted","Data":"cd21ca8ebf61b5a9de6c6b7f3b8eaa0190160ea61b3daeed8783b8e582b9e944"} Oct 01 13:16:20 crc kubenswrapper[4913]: I1001 13:16:20.552177 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" podStartSLOduration=1.932911085 podStartE2EDuration="2.552156894s" podCreationTimestamp="2025-10-01 13:16:18 +0000 UTC" firstStartedPulling="2025-10-01 13:16:19.432743057 +0000 UTC m=+2311.336218635" lastFinishedPulling="2025-10-01 13:16:20.051988866 +0000 UTC m=+2311.955464444" observedRunningTime="2025-10-01 13:16:20.549788659 +0000 UTC m=+2312.453264257" watchObservedRunningTime="2025-10-01 13:16:20.552156894 +0000 UTC m=+2312.455632482" Oct 01 13:16:23 crc kubenswrapper[4913]: I1001 13:16:23.806719 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:16:23 crc kubenswrapper[4913]: E1001 13:16:23.807237 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:16:30 crc kubenswrapper[4913]: I1001 13:16:30.623569 4913 generic.go:334] "Generic (PLEG): container finished" podID="2218b097-a775-4dca-88c6-d7676fc4ef97" containerID="cd21ca8ebf61b5a9de6c6b7f3b8eaa0190160ea61b3daeed8783b8e582b9e944" exitCode=0 Oct 01 13:16:30 crc kubenswrapper[4913]: I1001 13:16:30.623818 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" event={"ID":"2218b097-a775-4dca-88c6-d7676fc4ef97","Type":"ContainerDied","Data":"cd21ca8ebf61b5a9de6c6b7f3b8eaa0190160ea61b3daeed8783b8e582b9e944"} Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.083925 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.192196 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ssh-key-openstack-edpm-ipam\") pod \"2218b097-a775-4dca-88c6-d7676fc4ef97\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.192419 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ceph\") pod \"2218b097-a775-4dca-88c6-d7676fc4ef97\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.192505 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sd7c\" (UniqueName: \"kubernetes.io/projected/2218b097-a775-4dca-88c6-d7676fc4ef97-kube-api-access-2sd7c\") pod \"2218b097-a775-4dca-88c6-d7676fc4ef97\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.192540 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-inventory-0\") pod \"2218b097-a775-4dca-88c6-d7676fc4ef97\" (UID: \"2218b097-a775-4dca-88c6-d7676fc4ef97\") " Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.198838 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ceph" (OuterVolumeSpecName: "ceph") pod "2218b097-a775-4dca-88c6-d7676fc4ef97" (UID: "2218b097-a775-4dca-88c6-d7676fc4ef97"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.199596 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2218b097-a775-4dca-88c6-d7676fc4ef97-kube-api-access-2sd7c" (OuterVolumeSpecName: "kube-api-access-2sd7c") pod "2218b097-a775-4dca-88c6-d7676fc4ef97" (UID: "2218b097-a775-4dca-88c6-d7676fc4ef97"). InnerVolumeSpecName "kube-api-access-2sd7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.220587 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2218b097-a775-4dca-88c6-d7676fc4ef97" (UID: "2218b097-a775-4dca-88c6-d7676fc4ef97"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.223782 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2218b097-a775-4dca-88c6-d7676fc4ef97" (UID: "2218b097-a775-4dca-88c6-d7676fc4ef97"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.296214 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.296290 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.296316 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sd7c\" (UniqueName: \"kubernetes.io/projected/2218b097-a775-4dca-88c6-d7676fc4ef97-kube-api-access-2sd7c\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.296335 4913 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2218b097-a775-4dca-88c6-d7676fc4ef97-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.651236 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" event={"ID":"2218b097-a775-4dca-88c6-d7676fc4ef97","Type":"ContainerDied","Data":"1804fbe69e6b128bff852f90297a0893c6e0b1a0731765215ad7ee45c3d10b73"} Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.651292 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1804fbe69e6b128bff852f90297a0893c6e0b1a0731765215ad7ee45c3d10b73" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.652798 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2j2qp" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.710678 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r"] Oct 01 13:16:32 crc kubenswrapper[4913]: E1001 13:16:32.711144 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2218b097-a775-4dca-88c6-d7676fc4ef97" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.711163 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="2218b097-a775-4dca-88c6-d7676fc4ef97" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.711363 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="2218b097-a775-4dca-88c6-d7676fc4ef97" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.711993 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.713563 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.713755 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.715031 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.716344 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.716536 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.725100 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r"] Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.804320 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.804389 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5kj6\" (UniqueName: \"kubernetes.io/projected/629d4b85-c10c-45cc-ad03-4954acadaf15-kube-api-access-h5kj6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.804429 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.804995 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.907219 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.907354 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.907428 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5kj6\" (UniqueName: \"kubernetes.io/projected/629d4b85-c10c-45cc-ad03-4954acadaf15-kube-api-access-h5kj6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.907495 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.911433 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.912254 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.919911 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:32 crc kubenswrapper[4913]: I1001 13:16:32.923582 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5kj6\" (UniqueName: \"kubernetes.io/projected/629d4b85-c10c-45cc-ad03-4954acadaf15-kube-api-access-h5kj6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rpp5r\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:33 crc kubenswrapper[4913]: I1001 13:16:33.044050 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:33 crc kubenswrapper[4913]: I1001 13:16:33.560656 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r"] Oct 01 13:16:33 crc kubenswrapper[4913]: I1001 13:16:33.660045 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" event={"ID":"629d4b85-c10c-45cc-ad03-4954acadaf15","Type":"ContainerStarted","Data":"4ca4388d5aa8c2033028c5e7da7823bff5896e75820e94cbf61cb63647629fdd"} Oct 01 13:16:34 crc kubenswrapper[4913]: I1001 13:16:34.669941 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" event={"ID":"629d4b85-c10c-45cc-ad03-4954acadaf15","Type":"ContainerStarted","Data":"434402bf6ea40fe87f825d9a33a17ede601dc34d5e957ca44723ec00a6f019e2"} Oct 01 13:16:34 crc kubenswrapper[4913]: I1001 13:16:34.695955 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" podStartSLOduration=2.181090244 podStartE2EDuration="2.695934078s" podCreationTimestamp="2025-10-01 13:16:32 +0000 UTC" firstStartedPulling="2025-10-01 13:16:33.561042194 +0000 UTC m=+2325.464517782" lastFinishedPulling="2025-10-01 13:16:34.075886038 +0000 UTC m=+2325.979361616" observedRunningTime="2025-10-01 13:16:34.688742381 +0000 UTC m=+2326.592218009" watchObservedRunningTime="2025-10-01 13:16:34.695934078 +0000 UTC m=+2326.599409666" Oct 01 13:16:35 crc kubenswrapper[4913]: I1001 13:16:35.806778 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:16:35 crc kubenswrapper[4913]: E1001 13:16:35.806994 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:16:43 crc kubenswrapper[4913]: I1001 13:16:43.753563 4913 generic.go:334] "Generic (PLEG): container finished" podID="629d4b85-c10c-45cc-ad03-4954acadaf15" containerID="434402bf6ea40fe87f825d9a33a17ede601dc34d5e957ca44723ec00a6f019e2" exitCode=0 Oct 01 13:16:43 crc kubenswrapper[4913]: I1001 13:16:43.753913 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" event={"ID":"629d4b85-c10c-45cc-ad03-4954acadaf15","Type":"ContainerDied","Data":"434402bf6ea40fe87f825d9a33a17ede601dc34d5e957ca44723ec00a6f019e2"} Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.147706 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.238747 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-inventory\") pod \"629d4b85-c10c-45cc-ad03-4954acadaf15\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.238817 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ssh-key\") pod \"629d4b85-c10c-45cc-ad03-4954acadaf15\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.238907 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ceph\") pod \"629d4b85-c10c-45cc-ad03-4954acadaf15\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.238952 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5kj6\" (UniqueName: \"kubernetes.io/projected/629d4b85-c10c-45cc-ad03-4954acadaf15-kube-api-access-h5kj6\") pod \"629d4b85-c10c-45cc-ad03-4954acadaf15\" (UID: \"629d4b85-c10c-45cc-ad03-4954acadaf15\") " Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.245113 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ceph" (OuterVolumeSpecName: "ceph") pod "629d4b85-c10c-45cc-ad03-4954acadaf15" (UID: "629d4b85-c10c-45cc-ad03-4954acadaf15"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.245481 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629d4b85-c10c-45cc-ad03-4954acadaf15-kube-api-access-h5kj6" (OuterVolumeSpecName: "kube-api-access-h5kj6") pod "629d4b85-c10c-45cc-ad03-4954acadaf15" (UID: "629d4b85-c10c-45cc-ad03-4954acadaf15"). InnerVolumeSpecName "kube-api-access-h5kj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.271083 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-inventory" (OuterVolumeSpecName: "inventory") pod "629d4b85-c10c-45cc-ad03-4954acadaf15" (UID: "629d4b85-c10c-45cc-ad03-4954acadaf15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.274516 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "629d4b85-c10c-45cc-ad03-4954acadaf15" (UID: "629d4b85-c10c-45cc-ad03-4954acadaf15"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.340894 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.340932 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.340941 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/629d4b85-c10c-45cc-ad03-4954acadaf15-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.340950 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5kj6\" (UniqueName: \"kubernetes.io/projected/629d4b85-c10c-45cc-ad03-4954acadaf15-kube-api-access-h5kj6\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.772599 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" event={"ID":"629d4b85-c10c-45cc-ad03-4954acadaf15","Type":"ContainerDied","Data":"4ca4388d5aa8c2033028c5e7da7823bff5896e75820e94cbf61cb63647629fdd"} Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.772635 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rpp5r" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.772659 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca4388d5aa8c2033028c5e7da7823bff5896e75820e94cbf61cb63647629fdd" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.849237 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52"] Oct 01 13:16:45 crc kubenswrapper[4913]: E1001 13:16:45.849625 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629d4b85-c10c-45cc-ad03-4954acadaf15" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.849646 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="629d4b85-c10c-45cc-ad03-4954acadaf15" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.849841 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="629d4b85-c10c-45cc-ad03-4954acadaf15" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.850439 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.854080 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.854089 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.854436 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.854089 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.854300 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.865317 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52"] Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.951361 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.951410 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.951507 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:45 crc kubenswrapper[4913]: I1001 13:16:45.951541 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqp4n\" (UniqueName: \"kubernetes.io/projected/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-kube-api-access-lqp4n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.052842 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqp4n\" (UniqueName: \"kubernetes.io/projected/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-kube-api-access-lqp4n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.053115 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.053198 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.053344 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.058648 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.058865 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.061158 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.071759 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqp4n\" (UniqueName: \"kubernetes.io/projected/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-kube-api-access-lqp4n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.172068 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.513216 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52"] Oct 01 13:16:46 crc kubenswrapper[4913]: W1001 13:16:46.519230 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6dcbc19_3406_42c7_bc2c_8b17e750a3cd.slice/crio-da78b01551d7b3311b547a189abd1f0fac1a2cbb0f20e2f3132b06e9029d1ca3 WatchSource:0}: Error finding container da78b01551d7b3311b547a189abd1f0fac1a2cbb0f20e2f3132b06e9029d1ca3: Status 404 returned error can't find the container with id da78b01551d7b3311b547a189abd1f0fac1a2cbb0f20e2f3132b06e9029d1ca3 Oct 01 13:16:46 crc kubenswrapper[4913]: I1001 13:16:46.784035 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" event={"ID":"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd","Type":"ContainerStarted","Data":"da78b01551d7b3311b547a189abd1f0fac1a2cbb0f20e2f3132b06e9029d1ca3"} Oct 01 13:16:47 crc kubenswrapper[4913]: I1001 13:16:47.795400 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" event={"ID":"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd","Type":"ContainerStarted","Data":"76b6016055366d7785cb5c35b43226b354c9bb892583d1d37c28e2732bcae337"} Oct 01 13:16:47 crc kubenswrapper[4913]: I1001 13:16:47.829624 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" podStartSLOduration=2.091891001 podStartE2EDuration="2.82959557s" podCreationTimestamp="2025-10-01 13:16:45 +0000 UTC" firstStartedPulling="2025-10-01 13:16:46.521070742 +0000 UTC m=+2338.424546321" lastFinishedPulling="2025-10-01 13:16:47.258775312 +0000 UTC m=+2339.162250890" observedRunningTime="2025-10-01 13:16:47.817404164 +0000 UTC m=+2339.720879852" watchObservedRunningTime="2025-10-01 13:16:47.82959557 +0000 UTC m=+2339.733071188" Oct 01 13:16:50 crc kubenswrapper[4913]: I1001 13:16:50.808500 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:16:50 crc kubenswrapper[4913]: E1001 13:16:50.809888 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:16:57 crc kubenswrapper[4913]: I1001 13:16:57.905150 4913 generic.go:334] "Generic (PLEG): container finished" podID="a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" containerID="76b6016055366d7785cb5c35b43226b354c9bb892583d1d37c28e2732bcae337" exitCode=0 Oct 01 13:16:57 crc kubenswrapper[4913]: I1001 13:16:57.905227 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" event={"ID":"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd","Type":"ContainerDied","Data":"76b6016055366d7785cb5c35b43226b354c9bb892583d1d37c28e2732bcae337"} Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.426821 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.526656 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ceph\") pod \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.526783 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqp4n\" (UniqueName: \"kubernetes.io/projected/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-kube-api-access-lqp4n\") pod \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.526818 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ssh-key\") pod \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.526848 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-inventory\") pod \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\" (UID: \"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd\") " Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.533146 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ceph" (OuterVolumeSpecName: "ceph") pod "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" (UID: "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.534619 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-kube-api-access-lqp4n" (OuterVolumeSpecName: "kube-api-access-lqp4n") pod "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" (UID: "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd"). InnerVolumeSpecName "kube-api-access-lqp4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.567153 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" (UID: "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.574466 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-inventory" (OuterVolumeSpecName: "inventory") pod "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" (UID: "a6dcbc19-3406-42c7-bc2c-8b17e750a3cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.628805 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqp4n\" (UniqueName: \"kubernetes.io/projected/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-kube-api-access-lqp4n\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.628833 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.628841 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.628849 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6dcbc19-3406-42c7-bc2c-8b17e750a3cd-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.922592 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" event={"ID":"a6dcbc19-3406-42c7-bc2c-8b17e750a3cd","Type":"ContainerDied","Data":"da78b01551d7b3311b547a189abd1f0fac1a2cbb0f20e2f3132b06e9029d1ca3"} Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.922635 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da78b01551d7b3311b547a189abd1f0fac1a2cbb0f20e2f3132b06e9029d1ca3" Oct 01 13:16:59 crc kubenswrapper[4913]: I1001 13:16:59.922685 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.006117 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg"] Oct 01 13:17:00 crc kubenswrapper[4913]: E1001 13:17:00.006627 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.006651 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.006904 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6dcbc19-3406-42c7-bc2c-8b17e750a3cd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.007721 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.009906 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.009950 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.013252 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.013469 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.013484 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.013999 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.014053 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.015428 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.022283 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg"] Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.137712 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138009 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138160 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138339 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138453 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138562 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138688 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggq6x\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-kube-api-access-ggq6x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138802 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.138919 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.139028 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.139127 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.139261 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.139402 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241119 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241173 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241233 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241306 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241441 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241471 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.241573 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggq6x\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-kube-api-access-ggq6x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.242012 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.242079 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.242106 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.242162 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.242308 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.242338 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.246923 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.248053 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.249066 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.249076 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.250162 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.250191 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.250342 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.250736 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.250736 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.251095 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.251208 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.256146 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.261034 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggq6x\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-kube-api-access-ggq6x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.325007 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:00 crc kubenswrapper[4913]: I1001 13:17:00.981085 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg"] Oct 01 13:17:01 crc kubenswrapper[4913]: I1001 13:17:01.945605 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" event={"ID":"f4b6596a-2679-4c64-99f5-a966e8a3deef","Type":"ContainerStarted","Data":"c333b11959d2e28eb6ac2678cf39b16d6386778e153ec5ee29b2e491c02ea8fa"} Oct 01 13:17:01 crc kubenswrapper[4913]: I1001 13:17:01.946161 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" event={"ID":"f4b6596a-2679-4c64-99f5-a966e8a3deef","Type":"ContainerStarted","Data":"0c719710b99ba56d073ed13a1c823d63a481f1a9b4a87e3576b302431bc9c92b"} Oct 01 13:17:01 crc kubenswrapper[4913]: I1001 13:17:01.982107 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" podStartSLOduration=2.424096439 podStartE2EDuration="2.982073784s" podCreationTimestamp="2025-10-01 13:16:59 +0000 UTC" firstStartedPulling="2025-10-01 13:17:00.983964398 +0000 UTC m=+2352.887439976" lastFinishedPulling="2025-10-01 13:17:01.541941733 +0000 UTC m=+2353.445417321" observedRunningTime="2025-10-01 13:17:01.968212684 +0000 UTC m=+2353.871688292" watchObservedRunningTime="2025-10-01 13:17:01.982073784 +0000 UTC m=+2353.885549402" Oct 01 13:17:04 crc kubenswrapper[4913]: I1001 13:17:04.806740 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:17:04 crc kubenswrapper[4913]: E1001 13:17:04.807638 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:17:19 crc kubenswrapper[4913]: I1001 13:17:19.806767 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:17:19 crc kubenswrapper[4913]: E1001 13:17:19.807559 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:17:30 crc kubenswrapper[4913]: I1001 13:17:30.807447 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:17:30 crc kubenswrapper[4913]: E1001 13:17:30.809399 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:17:35 crc kubenswrapper[4913]: I1001 13:17:35.285842 4913 generic.go:334] "Generic (PLEG): container finished" podID="f4b6596a-2679-4c64-99f5-a966e8a3deef" containerID="c333b11959d2e28eb6ac2678cf39b16d6386778e153ec5ee29b2e491c02ea8fa" exitCode=0 Oct 01 13:17:35 crc kubenswrapper[4913]: I1001 13:17:35.285898 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" event={"ID":"f4b6596a-2679-4c64-99f5-a966e8a3deef","Type":"ContainerDied","Data":"c333b11959d2e28eb6ac2678cf39b16d6386778e153ec5ee29b2e491c02ea8fa"} Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.699616 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.889155 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-repo-setup-combined-ca-bundle\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.889361 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ceph\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890049 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890148 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-neutron-metadata-combined-ca-bundle\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890182 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-bootstrap-combined-ca-bundle\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890208 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ovn-combined-ca-bundle\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890249 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-inventory\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc 
kubenswrapper[4913]: I1001 13:17:36.890321 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890429 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggq6x\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-kube-api-access-ggq6x\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890887 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.890949 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-libvirt-combined-ca-bundle\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.891026 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ssh-key\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.891066 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-nova-combined-ca-bundle\") pod \"f4b6596a-2679-4c64-99f5-a966e8a3deef\" (UID: \"f4b6596a-2679-4c64-99f5-a966e8a3deef\") " Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.896326 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.896388 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.896406 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ceph" (OuterVolumeSpecName: "ceph") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.896445 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.897601 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.898214 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.898643 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-kube-api-access-ggq6x" (OuterVolumeSpecName: "kube-api-access-ggq6x") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "kube-api-access-ggq6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.901343 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.901386 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.901928 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.903406 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.920906 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-inventory" (OuterVolumeSpecName: "inventory") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.926903 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f4b6596a-2679-4c64-99f5-a966e8a3deef" (UID: "f4b6596a-2679-4c64-99f5-a966e8a3deef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994676 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggq6x\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-kube-api-access-ggq6x\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994717 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994732 4913 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994744 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994755 4913 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994766 4913 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994777 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994800 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994824 4913 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994838 4913 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994851 4913 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994862 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b6596a-2679-4c64-99f5-a966e8a3deef-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:36 crc kubenswrapper[4913]: I1001 13:17:36.994876 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f4b6596a-2679-4c64-99f5-a966e8a3deef-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.312647 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" event={"ID":"f4b6596a-2679-4c64-99f5-a966e8a3deef","Type":"ContainerDied","Data":"0c719710b99ba56d073ed13a1c823d63a481f1a9b4a87e3576b302431bc9c92b"} Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.312721 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c719710b99ba56d073ed13a1c823d63a481f1a9b4a87e3576b302431bc9c92b" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.312731 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.428081 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt"] Oct 01 13:17:37 crc kubenswrapper[4913]: E1001 13:17:37.428563 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b6596a-2679-4c64-99f5-a966e8a3deef" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.428589 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b6596a-2679-4c64-99f5-a966e8a3deef" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.428851 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b6596a-2679-4c64-99f5-a966e8a3deef" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.429751 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.432770 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.432980 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.433284 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.433518 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.433548 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.442832 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt"] Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.509165 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.509366 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.509480 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.509517 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8r6j\" (UniqueName: \"kubernetes.io/projected/0ee105ca-a1e0-4566-bba7-bba5eca729f0-kube-api-access-f8r6j\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.611046 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.611413 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.611435 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8r6j\" (UniqueName: \"kubernetes.io/projected/0ee105ca-a1e0-4566-bba7-bba5eca729f0-kube-api-access-f8r6j\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.611519 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.618219 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.618361 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.631778 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 
13:17:37.637771 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8r6j\" (UniqueName: \"kubernetes.io/projected/0ee105ca-a1e0-4566-bba7-bba5eca729f0-kube-api-access-f8r6j\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:37 crc kubenswrapper[4913]: I1001 13:17:37.746236 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:38 crc kubenswrapper[4913]: I1001 13:17:38.403421 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt"] Oct 01 13:17:38 crc kubenswrapper[4913]: W1001 13:17:38.415182 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee105ca_a1e0_4566_bba7_bba5eca729f0.slice/crio-4803811297f2c7dc477010a73ab643801a0db50cf1ac008519914abf1b886c98 WatchSource:0}: Error finding container 4803811297f2c7dc477010a73ab643801a0db50cf1ac008519914abf1b886c98: Status 404 returned error can't find the container with id 4803811297f2c7dc477010a73ab643801a0db50cf1ac008519914abf1b886c98 Oct 01 13:17:38 crc kubenswrapper[4913]: I1001 13:17:38.421404 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:17:39 crc kubenswrapper[4913]: I1001 13:17:39.336615 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" event={"ID":"0ee105ca-a1e0-4566-bba7-bba5eca729f0","Type":"ContainerStarted","Data":"dcc4659b23d670bc26f941952b687c73a16af4de0f5395cce22102e5db58eace"} Oct 01 13:17:39 crc kubenswrapper[4913]: I1001 13:17:39.337233 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" event={"ID":"0ee105ca-a1e0-4566-bba7-bba5eca729f0","Type":"ContainerStarted","Data":"4803811297f2c7dc477010a73ab643801a0db50cf1ac008519914abf1b886c98"} Oct 01 13:17:39 crc kubenswrapper[4913]: I1001 13:17:39.359338 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" podStartSLOduration=1.868243122 podStartE2EDuration="2.359315291s" podCreationTimestamp="2025-10-01 13:17:37 +0000 UTC" firstStartedPulling="2025-10-01 13:17:38.42115357 +0000 UTC m=+2390.324629148" lastFinishedPulling="2025-10-01 13:17:38.912225739 +0000 UTC m=+2390.815701317" observedRunningTime="2025-10-01 13:17:39.353597174 +0000 UTC m=+2391.257072772" watchObservedRunningTime="2025-10-01 13:17:39.359315291 +0000 UTC m=+2391.262790869" Oct 01 13:17:44 crc kubenswrapper[4913]: I1001 13:17:44.379905 4913 generic.go:334] "Generic (PLEG): container finished" podID="0ee105ca-a1e0-4566-bba7-bba5eca729f0" containerID="dcc4659b23d670bc26f941952b687c73a16af4de0f5395cce22102e5db58eace" exitCode=0 Oct 01 13:17:44 crc kubenswrapper[4913]: I1001 13:17:44.379967 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" event={"ID":"0ee105ca-a1e0-4566-bba7-bba5eca729f0","Type":"ContainerDied","Data":"dcc4659b23d670bc26f941952b687c73a16af4de0f5395cce22102e5db58eace"} Oct 01 13:17:44 crc kubenswrapper[4913]: I1001 13:17:44.806877 4913 scope.go:117] "RemoveContainer" 
containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:17:44 crc kubenswrapper[4913]: E1001 13:17:44.807765 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.868329 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.959954 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8r6j\" (UniqueName: \"kubernetes.io/projected/0ee105ca-a1e0-4566-bba7-bba5eca729f0-kube-api-access-f8r6j\") pod \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.960123 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ceph\") pod \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.960176 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ssh-key\") pod \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.960214 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-inventory\") pod \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\" (UID: \"0ee105ca-a1e0-4566-bba7-bba5eca729f0\") " Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.966291 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ceph" (OuterVolumeSpecName: "ceph") pod "0ee105ca-a1e0-4566-bba7-bba5eca729f0" (UID: "0ee105ca-a1e0-4566-bba7-bba5eca729f0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.966536 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee105ca-a1e0-4566-bba7-bba5eca729f0-kube-api-access-f8r6j" (OuterVolumeSpecName: "kube-api-access-f8r6j") pod "0ee105ca-a1e0-4566-bba7-bba5eca729f0" (UID: "0ee105ca-a1e0-4566-bba7-bba5eca729f0"). InnerVolumeSpecName "kube-api-access-f8r6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:45 crc kubenswrapper[4913]: I1001 13:17:45.984868 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ee105ca-a1e0-4566-bba7-bba5eca729f0" (UID: "0ee105ca-a1e0-4566-bba7-bba5eca729f0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.006508 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-inventory" (OuterVolumeSpecName: "inventory") pod "0ee105ca-a1e0-4566-bba7-bba5eca729f0" (UID: "0ee105ca-a1e0-4566-bba7-bba5eca729f0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.062223 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.062259 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.062285 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ee105ca-a1e0-4566-bba7-bba5eca729f0-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.062295 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8r6j\" (UniqueName: \"kubernetes.io/projected/0ee105ca-a1e0-4566-bba7-bba5eca729f0-kube-api-access-f8r6j\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.407919 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" event={"ID":"0ee105ca-a1e0-4566-bba7-bba5eca729f0","Type":"ContainerDied","Data":"4803811297f2c7dc477010a73ab643801a0db50cf1ac008519914abf1b886c98"} Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.408040 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.408256 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4803811297f2c7dc477010a73ab643801a0db50cf1ac008519914abf1b886c98" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.493358 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v"] Oct 01 13:17:46 crc kubenswrapper[4913]: E1001 13:17:46.494604 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee105ca-a1e0-4566-bba7-bba5eca729f0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.495015 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee105ca-a1e0-4566-bba7-bba5eca729f0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.495353 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee105ca-a1e0-4566-bba7-bba5eca729f0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.496175 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.500107 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.500644 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.500860 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.500936 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.502607 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.502875 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.516162 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v"] Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.572205 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.572263 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmhm\" (UniqueName: \"kubernetes.io/projected/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-kube-api-access-tdmhm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.572345 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.572412 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.572437 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 
13:17:46.572470 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.674310 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.674399 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmhm\" (UniqueName: \"kubernetes.io/projected/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-kube-api-access-tdmhm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.674462 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.674500 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.674528 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.674559 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.675659 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.681818 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.683251 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.683906 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.684434 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.706196 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmhm\" (UniqueName: \"kubernetes.io/projected/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-kube-api-access-tdmhm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kpl2v\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:46 crc kubenswrapper[4913]: I1001 13:17:46.812938 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:17:47 crc kubenswrapper[4913]: I1001 13:17:47.200283 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v"] Oct 01 13:17:47 crc kubenswrapper[4913]: W1001 13:17:47.208521 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod910bdd9f_9b3e_43a2_af16_1c392e80d9ed.slice/crio-bd4fa3e3c70b1744092b7d3ea2538152f802d7857e4c73a7248790886c993c74 WatchSource:0}: Error finding container bd4fa3e3c70b1744092b7d3ea2538152f802d7857e4c73a7248790886c993c74: Status 404 returned error can't find the container with id bd4fa3e3c70b1744092b7d3ea2538152f802d7857e4c73a7248790886c993c74 Oct 01 13:17:47 crc kubenswrapper[4913]: I1001 13:17:47.419189 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" event={"ID":"910bdd9f-9b3e-43a2-af16-1c392e80d9ed","Type":"ContainerStarted","Data":"bd4fa3e3c70b1744092b7d3ea2538152f802d7857e4c73a7248790886c993c74"} Oct 01 13:17:48 crc kubenswrapper[4913]: I1001 13:17:48.431743 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" event={"ID":"910bdd9f-9b3e-43a2-af16-1c392e80d9ed","Type":"ContainerStarted","Data":"ae7c7a555fc45f48b52d73b3b2e6c50b3c4d498b8cb818d508a3dd214ca7560f"} Oct 01 13:17:48 crc kubenswrapper[4913]: I1001 13:17:48.467209 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" podStartSLOduration=1.8570939499999999 podStartE2EDuration="2.467174525s" podCreationTimestamp="2025-10-01 13:17:46 +0000 UTC" firstStartedPulling="2025-10-01 13:17:47.210461681 +0000 UTC m=+2399.113937259" lastFinishedPulling="2025-10-01 13:17:47.820542246 +0000 UTC m=+2399.724017834" observedRunningTime="2025-10-01 13:17:48.450059175 +0000 UTC m=+2400.353534783" watchObservedRunningTime="2025-10-01 13:17:48.467174525 +0000 UTC m=+2400.370650143" Oct 01 13:17:58 crc kubenswrapper[4913]: I1001 13:17:58.813472 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:17:58 crc kubenswrapper[4913]: E1001 13:17:58.814290 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:18:11 crc kubenswrapper[4913]: I1001 13:18:11.807183 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:18:11 crc kubenswrapper[4913]: E1001 13:18:11.808324 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:18:23 crc kubenswrapper[4913]: I1001 13:18:23.806440 4913 scope.go:117] 
"RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:18:23 crc kubenswrapper[4913]: E1001 13:18:23.807173 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:18:37 crc kubenswrapper[4913]: I1001 13:18:37.807111 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:18:37 crc kubenswrapper[4913]: E1001 13:18:37.808457 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:18:51 crc kubenswrapper[4913]: I1001 13:18:51.806599 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:18:51 crc kubenswrapper[4913]: E1001 13:18:51.807410 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:19:02 crc kubenswrapper[4913]: I1001 13:19:02.807137 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:19:02 crc kubenswrapper[4913]: E1001 13:19:02.808390 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:19:07 crc kubenswrapper[4913]: I1001 13:19:07.264962 4913 generic.go:334] "Generic (PLEG): container finished" podID="910bdd9f-9b3e-43a2-af16-1c392e80d9ed" containerID="ae7c7a555fc45f48b52d73b3b2e6c50b3c4d498b8cb818d508a3dd214ca7560f" exitCode=0 Oct 01 13:19:07 crc kubenswrapper[4913]: I1001 13:19:07.265111 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" event={"ID":"910bdd9f-9b3e-43a2-af16-1c392e80d9ed","Type":"ContainerDied","Data":"ae7c7a555fc45f48b52d73b3b2e6c50b3c4d498b8cb818d508a3dd214ca7560f"} Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.702397 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.849451 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-inventory\") pod \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.850491 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovncontroller-config-0\") pod \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.850724 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovn-combined-ca-bundle\") pod \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.851064 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ceph\") pod \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.851132 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ssh-key\") pod \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.851263 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdmhm\" (UniqueName: \"kubernetes.io/projected/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-kube-api-access-tdmhm\") pod \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\" (UID: \"910bdd9f-9b3e-43a2-af16-1c392e80d9ed\") " Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.857852 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "910bdd9f-9b3e-43a2-af16-1c392e80d9ed" (UID: "910bdd9f-9b3e-43a2-af16-1c392e80d9ed"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.864462 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ceph" (OuterVolumeSpecName: "ceph") pod "910bdd9f-9b3e-43a2-af16-1c392e80d9ed" (UID: "910bdd9f-9b3e-43a2-af16-1c392e80d9ed"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.864562 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-kube-api-access-tdmhm" (OuterVolumeSpecName: "kube-api-access-tdmhm") pod "910bdd9f-9b3e-43a2-af16-1c392e80d9ed" (UID: "910bdd9f-9b3e-43a2-af16-1c392e80d9ed"). InnerVolumeSpecName "kube-api-access-tdmhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.883252 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "910bdd9f-9b3e-43a2-af16-1c392e80d9ed" (UID: "910bdd9f-9b3e-43a2-af16-1c392e80d9ed"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.888941 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "910bdd9f-9b3e-43a2-af16-1c392e80d9ed" (UID: "910bdd9f-9b3e-43a2-af16-1c392e80d9ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.896371 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-inventory" (OuterVolumeSpecName: "inventory") pod "910bdd9f-9b3e-43a2-af16-1c392e80d9ed" (UID: "910bdd9f-9b3e-43a2-af16-1c392e80d9ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.954126 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdmhm\" (UniqueName: \"kubernetes.io/projected/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-kube-api-access-tdmhm\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.954178 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.954197 4913 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.954214 4913 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.954232 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:08 crc kubenswrapper[4913]: I1001 13:19:08.954248 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/910bdd9f-9b3e-43a2-af16-1c392e80d9ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.292666 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" event={"ID":"910bdd9f-9b3e-43a2-af16-1c392e80d9ed","Type":"ContainerDied","Data":"bd4fa3e3c70b1744092b7d3ea2538152f802d7857e4c73a7248790886c993c74"} Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.292710 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4fa3e3c70b1744092b7d3ea2538152f802d7857e4c73a7248790886c993c74" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 
13:19:09.292815 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kpl2v" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.399944 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z"] Oct 01 13:19:09 crc kubenswrapper[4913]: E1001 13:19:09.400353 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910bdd9f-9b3e-43a2-af16-1c392e80d9ed" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.400373 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="910bdd9f-9b3e-43a2-af16-1c392e80d9ed" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.400574 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="910bdd9f-9b3e-43a2-af16-1c392e80d9ed" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.401217 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.412398 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.412778 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.413073 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.413245 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.413340 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.419115 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.420378 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.443422 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z"] Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.564949 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkt7v\" (UniqueName: \"kubernetes.io/projected/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-kube-api-access-mkt7v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.565142 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: 
\"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.565254 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.565491 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.565585 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.565759 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.565842 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.668192 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.668533 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.668697 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.668874 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkt7v\" (UniqueName: \"kubernetes.io/projected/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-kube-api-access-mkt7v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.669024 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.669159 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.669297 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.674346 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.675039 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.675685 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.675916 
4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.676347 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.676386 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.692688 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkt7v\" (UniqueName: \"kubernetes.io/projected/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-kube-api-access-mkt7v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:09 crc kubenswrapper[4913]: I1001 13:19:09.747872 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:19:10 crc kubenswrapper[4913]: I1001 13:19:10.293348 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z"] Oct 01 13:19:10 crc kubenswrapper[4913]: W1001 13:19:10.305095 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode92e49b3_b2d4_41f0_9933_ea43cc692f5a.slice/crio-65eec237f751f5ea87a2e82e99ef31cee005e2d0387a731dc18442952c6e6e7b WatchSource:0}: Error finding container 65eec237f751f5ea87a2e82e99ef31cee005e2d0387a731dc18442952c6e6e7b: Status 404 returned error can't find the container with id 65eec237f751f5ea87a2e82e99ef31cee005e2d0387a731dc18442952c6e6e7b Oct 01 13:19:11 crc kubenswrapper[4913]: I1001 13:19:11.314157 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" event={"ID":"e92e49b3-b2d4-41f0-9933-ea43cc692f5a","Type":"ContainerStarted","Data":"a9f840989ea22e2ab810bd1830dfd56728bc677bc5135334b709c8d2a82d8b4e"} Oct 01 13:19:11 crc kubenswrapper[4913]: I1001 13:19:11.314823 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" event={"ID":"e92e49b3-b2d4-41f0-9933-ea43cc692f5a","Type":"ContainerStarted","Data":"65eec237f751f5ea87a2e82e99ef31cee005e2d0387a731dc18442952c6e6e7b"} Oct 01 13:19:11 crc kubenswrapper[4913]: I1001 13:19:11.352515 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" podStartSLOduration=1.782224681 podStartE2EDuration="2.352495024s" podCreationTimestamp="2025-10-01 13:19:09 +0000 UTC" firstStartedPulling="2025-10-01 13:19:10.309854726 +0000 UTC m=+2482.213330304" lastFinishedPulling="2025-10-01 13:19:10.880125059 +0000 UTC m=+2482.783600647" observedRunningTime="2025-10-01 13:19:11.342032178 +0000 UTC m=+2483.245507856" watchObservedRunningTime="2025-10-01 13:19:11.352495024 +0000 UTC m=+2483.255970612" Oct 01 13:19:17 crc kubenswrapper[4913]: I1001 13:19:17.807617 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:19:18 crc kubenswrapper[4913]: I1001 13:19:18.426195 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"1b76ae050c378f5829f6b759e6437f37d875093ba22902111d9211cf256e9b0f"} Oct 01 13:20:10 crc kubenswrapper[4913]: I1001 13:20:10.933616 4913 generic.go:334] "Generic (PLEG): container finished" podID="e92e49b3-b2d4-41f0-9933-ea43cc692f5a" containerID="a9f840989ea22e2ab810bd1830dfd56728bc677bc5135334b709c8d2a82d8b4e" exitCode=0 Oct 01 13:20:10 crc kubenswrapper[4913]: I1001 13:20:10.933696 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" event={"ID":"e92e49b3-b2d4-41f0-9933-ea43cc692f5a","Type":"ContainerDied","Data":"a9f840989ea22e2ab810bd1830dfd56728bc677bc5135334b709c8d2a82d8b4e"} Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.418397 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.480911 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-metadata-combined-ca-bundle\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.480999 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.481115 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ceph\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.481321 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkt7v\" (UniqueName: \"kubernetes.io/projected/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-kube-api-access-mkt7v\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.481392 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ssh-key\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.481410 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-nova-metadata-neutron-config-0\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.481466 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-inventory\") pod \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\" (UID: \"e92e49b3-b2d4-41f0-9933-ea43cc692f5a\") " Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.487603 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ceph" (OuterVolumeSpecName: "ceph") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.487643 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-kube-api-access-mkt7v" (OuterVolumeSpecName: "kube-api-access-mkt7v") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "kube-api-access-mkt7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.489562 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.513730 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.514439 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-inventory" (OuterVolumeSpecName: "inventory") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.515073 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.518655 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e92e49b3-b2d4-41f0-9933-ea43cc692f5a" (UID: "e92e49b3-b2d4-41f0-9933-ea43cc692f5a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584448 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584481 4913 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584496 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584510 4913 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584525 4913 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584536 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.584548 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkt7v\" (UniqueName: \"kubernetes.io/projected/e92e49b3-b2d4-41f0-9933-ea43cc692f5a-kube-api-access-mkt7v\") on node \"crc\" DevicePath \"\"" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.956346 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" event={"ID":"e92e49b3-b2d4-41f0-9933-ea43cc692f5a","Type":"ContainerDied","Data":"65eec237f751f5ea87a2e82e99ef31cee005e2d0387a731dc18442952c6e6e7b"} Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.956410 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65eec237f751f5ea87a2e82e99ef31cee005e2d0387a731dc18442952c6e6e7b" Oct 01 13:20:12 crc kubenswrapper[4913]: I1001 13:20:12.956678 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.143733 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m"] Oct 01 13:20:13 crc kubenswrapper[4913]: E1001 13:20:13.144133 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92e49b3-b2d4-41f0-9933-ea43cc692f5a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.144155 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92e49b3-b2d4-41f0-9933-ea43cc692f5a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.144416 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92e49b3-b2d4-41f0-9933-ea43cc692f5a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.145085 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.147643 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.147883 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.149448 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.149758 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.149919 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.149965 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.158851 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m"] Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.194919 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.195202 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.195739 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.195959 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzpc6\" (UniqueName: \"kubernetes.io/projected/5248ec5f-6231-40a3-be9d-815bdf5ec259-kube-api-access-xzpc6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.196175 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.196257 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.298306 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.298562 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.298615 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.298646 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzpc6\" (UniqueName: \"kubernetes.io/projected/5248ec5f-6231-40a3-be9d-815bdf5ec259-kube-api-access-xzpc6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.298686 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ceph\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.298707 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.306435 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.306440 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.306659 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.308800 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.310820 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.322002 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzpc6\" (UniqueName: \"kubernetes.io/projected/5248ec5f-6231-40a3-be9d-815bdf5ec259-kube-api-access-xzpc6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znc8m\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.462094 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.889292 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m"] Oct 01 13:20:13 crc kubenswrapper[4913]: I1001 13:20:13.969489 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" event={"ID":"5248ec5f-6231-40a3-be9d-815bdf5ec259","Type":"ContainerStarted","Data":"69126c6b3873c6d3237a89a0d6b13115a8a536f97e35f964032eecb646461783"} Oct 01 13:20:14 crc kubenswrapper[4913]: I1001 13:20:14.980564 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" event={"ID":"5248ec5f-6231-40a3-be9d-815bdf5ec259","Type":"ContainerStarted","Data":"e599f071f4931b15fbb327f1a3b2ebc822434f65c67d3eb0730958802349bdf8"} Oct 01 13:21:40 crc kubenswrapper[4913]: I1001 13:21:40.083917 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:21:40 crc kubenswrapper[4913]: I1001 13:21:40.084589 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.350447 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" podStartSLOduration=103.787096075 podStartE2EDuration="1m44.350423868s" podCreationTimestamp="2025-10-01 13:20:13 +0000 UTC" firstStartedPulling="2025-10-01 13:20:13.896489246 +0000 UTC m=+2545.799964824" lastFinishedPulling="2025-10-01 13:20:14.459817039 +0000 UTC m=+2546.363292617" observedRunningTime="2025-10-01 13:20:14.996856389 +0000 UTC m=+2546.900331987" watchObservedRunningTime="2025-10-01 13:21:57.350423868 +0000 UTC m=+2649.253899486" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.367422 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sgqjt"] Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.369884 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.410538 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sgqjt"] Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.509281 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-utilities\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.509320 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-catalog-content\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.509455 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48rt\" (UniqueName: \"kubernetes.io/projected/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-kube-api-access-j48rt\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.610909 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-utilities\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.610982 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-catalog-content\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.611072 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48rt\" (UniqueName: \"kubernetes.io/projected/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-kube-api-access-j48rt\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.612048 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-utilities\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.612437 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-catalog-content\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.633361 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j48rt\" (UniqueName: \"kubernetes.io/projected/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-kube-api-access-j48rt\") pod \"community-operators-sgqjt\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:57 crc kubenswrapper[4913]: I1001 13:21:57.732651 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:21:58 crc kubenswrapper[4913]: I1001 13:21:58.238001 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sgqjt"] Oct 01 13:21:58 crc kubenswrapper[4913]: I1001 13:21:58.938555 4913 generic.go:334] "Generic (PLEG): container finished" podID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerID="2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f" exitCode=0 Oct 01 13:21:58 crc kubenswrapper[4913]: I1001 13:21:58.938625 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgqjt" event={"ID":"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59","Type":"ContainerDied","Data":"2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f"} Oct 01 13:21:58 crc kubenswrapper[4913]: I1001 13:21:58.938664 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgqjt" event={"ID":"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59","Type":"ContainerStarted","Data":"45fc9f1c8c72abdaf193070a175dde6ccffa7ae6f0251d462d4b4520151cf377"} Oct 01 13:22:00 crc kubenswrapper[4913]: I1001 13:22:00.962062 4913 generic.go:334] "Generic (PLEG): container finished" podID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerID="e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d" exitCode=0 Oct 01 13:22:00 crc kubenswrapper[4913]: I1001 13:22:00.962145 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgqjt" event={"ID":"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59","Type":"ContainerDied","Data":"e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d"} Oct 01 13:22:01 crc kubenswrapper[4913]: I1001 13:22:01.974514 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgqjt" event={"ID":"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59","Type":"ContainerStarted","Data":"e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac"} Oct 01 13:22:02 crc kubenswrapper[4913]: I1001 13:22:02.005852 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sgqjt" podStartSLOduration=2.445270827 podStartE2EDuration="5.0058308s" podCreationTimestamp="2025-10-01 13:21:57 +0000 UTC" firstStartedPulling="2025-10-01 13:21:58.941292874 +0000 UTC m=+2650.844768472" lastFinishedPulling="2025-10-01 13:22:01.501852857 +0000 UTC m=+2653.405328445" observedRunningTime="2025-10-01 13:22:01.993856652 +0000 UTC m=+2653.897332320" watchObservedRunningTime="2025-10-01 13:22:02.0058308 +0000 UTC m=+2653.909306378" Oct 01 13:22:07 crc kubenswrapper[4913]: I1001 13:22:07.733130 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:22:07 crc kubenswrapper[4913]: I1001 13:22:07.733974 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:22:07 crc kubenswrapper[4913]: I1001 13:22:07.817285 4913 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:22:08 crc kubenswrapper[4913]: I1001 13:22:08.129392 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:22:08 crc kubenswrapper[4913]: I1001 13:22:08.173085 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sgqjt"] Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.083254 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.083595 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.097052 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sgqjt" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="registry-server" containerID="cri-o://e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac" gracePeriod=2 Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.524967 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.571216 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48rt\" (UniqueName: \"kubernetes.io/projected/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-kube-api-access-j48rt\") pod \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.571295 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-catalog-content\") pod \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.571426 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-utilities\") pod \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\" (UID: \"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59\") " Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.572727 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-utilities" (OuterVolumeSpecName: "utilities") pod "9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" (UID: "9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.577594 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-kube-api-access-j48rt" (OuterVolumeSpecName: "kube-api-access-j48rt") pod "9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" (UID: "9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59"). InnerVolumeSpecName "kube-api-access-j48rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.647517 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" (UID: "9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.673719 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48rt\" (UniqueName: \"kubernetes.io/projected/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-kube-api-access-j48rt\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.673746 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:10 crc kubenswrapper[4913]: I1001 13:22:10.673757 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.118391 4913 generic.go:334] "Generic (PLEG): container finished" podID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerID="e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac" exitCode=0 Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.118449 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgqjt" event={"ID":"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59","Type":"ContainerDied","Data":"e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac"} Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.118497 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgqjt" event={"ID":"9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59","Type":"ContainerDied","Data":"45fc9f1c8c72abdaf193070a175dde6ccffa7ae6f0251d462d4b4520151cf377"} Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.118517 4913 scope.go:117] "RemoveContainer" containerID="e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.118734 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sgqjt" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.159295 4913 scope.go:117] "RemoveContainer" containerID="e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.172456 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sgqjt"] Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.182759 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sgqjt"] Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.191950 4913 scope.go:117] "RemoveContainer" containerID="2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.245743 4913 scope.go:117] "RemoveContainer" containerID="e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac" Oct 01 13:22:11 crc kubenswrapper[4913]: E1001 13:22:11.247115 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac\": container with ID starting with e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac not found: ID does not exist" containerID="e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.247161 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac"} err="failed to get container status \"e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac\": rpc error: code = NotFound desc = could not find container \"e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac\": container with ID starting with e63eb78605c2f0c97103ced2cfb715de5c636161c60204e6b790c6f4f7ee7eac not found: ID does not exist" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.247192 4913 scope.go:117] "RemoveContainer" containerID="e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d" Oct 01 13:22:11 crc kubenswrapper[4913]: E1001 13:22:11.247620 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d\": container with ID starting with e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d not found: ID does not exist" containerID="e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.247655 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d"} err="failed to get container status \"e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d\": rpc error: code = NotFound desc = could not find container \"e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d\": container with ID starting with e73ec1cba6ad349ed69d825b87a7daca9fae03d4ff90466f6d3a4bbf27235f6d not found: ID does not exist" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.247688 4913 scope.go:117] "RemoveContainer" containerID="2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f" Oct 01 13:22:11 crc kubenswrapper[4913]: E1001 13:22:11.248117 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f\": container with ID starting with 2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f not found: ID does not exist" containerID="2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f" Oct 01 13:22:11 crc kubenswrapper[4913]: I1001 13:22:11.248138 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f"} err="failed to get container status \"2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f\": rpc error: code = NotFound desc = could not find container \"2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f\": container with ID starting with 2e3737ad0cb90b120847b60960e2d3fbcdf2b73f734b6cd1aa6d27f7e39ff98f not found: ID does not exist" Oct 01 13:22:12 crc kubenswrapper[4913]: I1001 13:22:12.823162 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" path="/var/lib/kubelet/pods/9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59/volumes" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.528655 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mzg5l"] Oct 01 13:22:18 crc kubenswrapper[4913]: E1001 13:22:18.529960 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="extract-content" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.529983 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="extract-content" Oct 01 13:22:18 crc kubenswrapper[4913]: E1001 13:22:18.530027 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="registry-server" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.530039 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="registry-server" Oct 01 13:22:18 crc kubenswrapper[4913]: E1001 13:22:18.530078 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="extract-utilities" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.530092 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="extract-utilities" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.530453 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="9361b7a6-7c55-42ea-b5c1-e76f8c9a2b59" containerName="registry-server" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.533405 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.558041 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzg5l"] Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.633701 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcstd\" (UniqueName: \"kubernetes.io/projected/18bdfd02-b89e-4c71-8188-f371dd143f59-kube-api-access-vcstd\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.633919 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-utilities\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.634002 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-catalog-content\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.735805 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcstd\" (UniqueName: \"kubernetes.io/projected/18bdfd02-b89e-4c71-8188-f371dd143f59-kube-api-access-vcstd\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.736128 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-utilities\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.736158 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-catalog-content\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.736679 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-catalog-content\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.736791 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-utilities\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.761875 4913 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vcstd\" (UniqueName: \"kubernetes.io/projected/18bdfd02-b89e-4c71-8188-f371dd143f59-kube-api-access-vcstd\") pod \"redhat-marketplace-mzg5l\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:18 crc kubenswrapper[4913]: I1001 13:22:18.852554 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:19 crc kubenswrapper[4913]: I1001 13:22:19.284527 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzg5l"] Oct 01 13:22:20 crc kubenswrapper[4913]: I1001 13:22:20.208540 4913 generic.go:334] "Generic (PLEG): container finished" podID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerID="6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1" exitCode=0 Oct 01 13:22:20 crc kubenswrapper[4913]: I1001 13:22:20.208637 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzg5l" event={"ID":"18bdfd02-b89e-4c71-8188-f371dd143f59","Type":"ContainerDied","Data":"6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1"} Oct 01 13:22:20 crc kubenswrapper[4913]: I1001 13:22:20.209074 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzg5l" event={"ID":"18bdfd02-b89e-4c71-8188-f371dd143f59","Type":"ContainerStarted","Data":"8d89b914c7a4e5161ab9c3dfa8fdf74d028e7a386672d113bf05db2dc4f7ca0a"} Oct 01 13:22:22 crc kubenswrapper[4913]: I1001 13:22:22.236022 4913 generic.go:334] "Generic (PLEG): container finished" podID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerID="db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a" exitCode=0 Oct 01 13:22:22 crc kubenswrapper[4913]: I1001 13:22:22.236303 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzg5l" event={"ID":"18bdfd02-b89e-4c71-8188-f371dd143f59","Type":"ContainerDied","Data":"db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a"} Oct 01 13:22:23 crc kubenswrapper[4913]: I1001 13:22:23.249015 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzg5l" event={"ID":"18bdfd02-b89e-4c71-8188-f371dd143f59","Type":"ContainerStarted","Data":"2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68"} Oct 01 13:22:28 crc kubenswrapper[4913]: I1001 13:22:28.852799 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:28 crc kubenswrapper[4913]: I1001 13:22:28.853493 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:28 crc kubenswrapper[4913]: I1001 13:22:28.913914 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:28 crc kubenswrapper[4913]: I1001 13:22:28.941631 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mzg5l" podStartSLOduration=8.21786851 podStartE2EDuration="10.94160978s" podCreationTimestamp="2025-10-01 13:22:18 +0000 UTC" firstStartedPulling="2025-10-01 13:22:20.209917764 +0000 UTC m=+2672.113393342" lastFinishedPulling="2025-10-01 13:22:22.933658994 +0000 UTC m=+2674.837134612" observedRunningTime="2025-10-01 13:22:23.286733215 +0000 UTC 
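[Editor's note: the pod_startup_latency_tracker entry above reports two durations for the same pod: podStartE2EDuration runs from podCreationTimestamp to watchObservedRunningTime, while podStartSLOduration additionally excludes the image-pull interval (firstStartedPulling to lastFinishedPulling). A back-of-the-envelope check with the timestamps shown, truncated to microsecond precision; variable names are my own.]

```python
from datetime import datetime, timezone

fmt = "%Y-%m-%d %H:%M:%S.%f"  # datetime handles at most 6 fractional digits
created = datetime(2025, 10, 1, 13, 22, 18, tzinfo=timezone.utc)
first_pull = datetime.strptime("2025-10-01 13:22:20.209917", fmt).replace(tzinfo=timezone.utc)
last_pull = datetime.strptime("2025-10-01 13:22:22.933658", fmt).replace(tzinfo=timezone.utc)
watched = datetime.strptime("2025-10-01 13:22:28.941609", fmt).replace(tzinfo=timezone.utc)

e2e = (watched - created).total_seconds()              # ~10.94s: podStartE2EDuration
slo = e2e - (last_pull - first_pull).total_seconds()   # ~8.22s: pull time excluded
print(f"E2E {e2e:.2f}s, SLO {slo:.2f}s")  # agrees with the log to within rounding
```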
m=+2675.190208793" watchObservedRunningTime="2025-10-01 13:22:28.94160978 +0000 UTC m=+2680.845085358" Oct 01 13:22:29 crc kubenswrapper[4913]: I1001 13:22:29.348568 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:29 crc kubenswrapper[4913]: I1001 13:22:29.401035 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzg5l"] Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.323542 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mzg5l" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="registry-server" containerID="cri-o://2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68" gracePeriod=2 Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.742913 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.884419 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-catalog-content\") pod \"18bdfd02-b89e-4c71-8188-f371dd143f59\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.884780 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-utilities\") pod \"18bdfd02-b89e-4c71-8188-f371dd143f59\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.884893 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcstd\" (UniqueName: \"kubernetes.io/projected/18bdfd02-b89e-4c71-8188-f371dd143f59-kube-api-access-vcstd\") pod \"18bdfd02-b89e-4c71-8188-f371dd143f59\" (UID: \"18bdfd02-b89e-4c71-8188-f371dd143f59\") " Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.886382 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-utilities" (OuterVolumeSpecName: "utilities") pod "18bdfd02-b89e-4c71-8188-f371dd143f59" (UID: "18bdfd02-b89e-4c71-8188-f371dd143f59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.892342 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bdfd02-b89e-4c71-8188-f371dd143f59-kube-api-access-vcstd" (OuterVolumeSpecName: "kube-api-access-vcstd") pod "18bdfd02-b89e-4c71-8188-f371dd143f59" (UID: "18bdfd02-b89e-4c71-8188-f371dd143f59"). InnerVolumeSpecName "kube-api-access-vcstd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.901433 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18bdfd02-b89e-4c71-8188-f371dd143f59" (UID: "18bdfd02-b89e-4c71-8188-f371dd143f59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.987180 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.987247 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcstd\" (UniqueName: \"kubernetes.io/projected/18bdfd02-b89e-4c71-8188-f371dd143f59-kube-api-access-vcstd\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:31 crc kubenswrapper[4913]: I1001 13:22:31.987274 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18bdfd02-b89e-4c71-8188-f371dd143f59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.333845 4913 generic.go:334] "Generic (PLEG): container finished" podID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerID="2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68" exitCode=0 Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.333893 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzg5l" event={"ID":"18bdfd02-b89e-4c71-8188-f371dd143f59","Type":"ContainerDied","Data":"2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68"} Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.333925 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzg5l" event={"ID":"18bdfd02-b89e-4c71-8188-f371dd143f59","Type":"ContainerDied","Data":"8d89b914c7a4e5161ab9c3dfa8fdf74d028e7a386672d113bf05db2dc4f7ca0a"} Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.333946 4913 scope.go:117] "RemoveContainer" containerID="2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.333980 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzg5l" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.372406 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzg5l"] Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.382423 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzg5l"] Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.398463 4913 scope.go:117] "RemoveContainer" containerID="db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.423528 4913 scope.go:117] "RemoveContainer" containerID="6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.470369 4913 scope.go:117] "RemoveContainer" containerID="2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68" Oct 01 13:22:32 crc kubenswrapper[4913]: E1001 13:22:32.470856 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68\": container with ID starting with 2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68 not found: ID does not exist" containerID="2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.470894 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68"} err="failed to get container status \"2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68\": rpc error: code = NotFound desc = could not find container \"2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68\": container with ID starting with 2d90fdee9e02644a40403fcae2fe4e8b50d81ef8eb52899d11513738abcc6d68 not found: ID does not exist" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.470917 4913 scope.go:117] "RemoveContainer" containerID="db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a" Oct 01 13:22:32 crc kubenswrapper[4913]: E1001 13:22:32.471508 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a\": container with ID starting with db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a not found: ID does not exist" containerID="db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.471533 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a"} err="failed to get container status \"db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a\": rpc error: code = NotFound desc = could not find container \"db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a\": container with ID starting with db0a0d1fe341b2f8fbb456a5ec79d18860ba505f0ea3802f9d4d5bb67f770c9a not found: ID does not exist" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.471553 4913 scope.go:117] "RemoveContainer" containerID="6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1" Oct 01 13:22:32 crc kubenswrapper[4913]: E1001 13:22:32.471826 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1\": container with ID starting with 6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1 not found: ID does not exist" containerID="6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.471853 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1"} err="failed to get container status \"6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1\": rpc error: code = NotFound desc = could not find container \"6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1\": container with ID starting with 6b60171e97ef154e81bff02d0f3c043f0f55038bdf70aa2903dcd6a0cead2ec1 not found: ID does not exist" Oct 01 13:22:32 crc kubenswrapper[4913]: I1001 13:22:32.833171 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" path="/var/lib/kubelet/pods/18bdfd02-b89e-4c71-8188-f371dd143f59/volumes" Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.083810 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.084829 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.084928 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.086774 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b76ae050c378f5829f6b759e6437f37d875093ba22902111d9211cf256e9b0f"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.087206 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://1b76ae050c378f5829f6b759e6437f37d875093ba22902111d9211cf256e9b0f" gracePeriod=600 Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.413189 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="1b76ae050c378f5829f6b759e6437f37d875093ba22902111d9211cf256e9b0f" exitCode=0 Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.413291 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" 
event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"1b76ae050c378f5829f6b759e6437f37d875093ba22902111d9211cf256e9b0f"} Oct 01 13:22:40 crc kubenswrapper[4913]: I1001 13:22:40.413601 4913 scope.go:117] "RemoveContainer" containerID="49e7251c02dbe69d86929a21422f0ef94a42212a25ea879e844ce215a59232e5" Oct 01 13:22:41 crc kubenswrapper[4913]: I1001 13:22:41.429467 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6"} Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.451884 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p5nxw"] Oct 01 13:23:28 crc kubenswrapper[4913]: E1001 13:23:28.453213 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="extract-content" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.453259 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="extract-content" Oct 01 13:23:28 crc kubenswrapper[4913]: E1001 13:23:28.453307 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="extract-utilities" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.453318 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="extract-utilities" Oct 01 13:23:28 crc kubenswrapper[4913]: E1001 13:23:28.453379 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="registry-server" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.453387 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="registry-server" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.453790 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bdfd02-b89e-4c71-8188-f371dd143f59" containerName="registry-server" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.456177 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.496447 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5nxw"] Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.550056 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46ch7\" (UniqueName: \"kubernetes.io/projected/d6e4192f-512f-4d3f-8730-51c2ea55be13-kube-api-access-46ch7\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.550109 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-catalog-content\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.550162 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-utilities\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.653473 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-utilities\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.653945 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46ch7\" (UniqueName: \"kubernetes.io/projected/d6e4192f-512f-4d3f-8730-51c2ea55be13-kube-api-access-46ch7\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.654025 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-catalog-content\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.654115 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-utilities\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.655110 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-catalog-content\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.684020 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-46ch7\" (UniqueName: \"kubernetes.io/projected/d6e4192f-512f-4d3f-8730-51c2ea55be13-kube-api-access-46ch7\") pod \"certified-operators-p5nxw\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:28 crc kubenswrapper[4913]: I1001 13:23:28.846883 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:29 crc kubenswrapper[4913]: I1001 13:23:29.302342 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5nxw"] Oct 01 13:23:29 crc kubenswrapper[4913]: I1001 13:23:29.860605 4913 generic.go:334] "Generic (PLEG): container finished" podID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerID="ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7" exitCode=0 Oct 01 13:23:29 crc kubenswrapper[4913]: I1001 13:23:29.860649 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5nxw" event={"ID":"d6e4192f-512f-4d3f-8730-51c2ea55be13","Type":"ContainerDied","Data":"ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7"} Oct 01 13:23:29 crc kubenswrapper[4913]: I1001 13:23:29.860675 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5nxw" event={"ID":"d6e4192f-512f-4d3f-8730-51c2ea55be13","Type":"ContainerStarted","Data":"4b419531136aa081faf21b0b08aae2741e6d25f1d5fd922aec574021e633bf84"} Oct 01 13:23:29 crc kubenswrapper[4913]: I1001 13:23:29.863073 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:23:31 crc kubenswrapper[4913]: I1001 13:23:31.883851 4913 generic.go:334] "Generic (PLEG): container finished" podID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerID="5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4" exitCode=0 Oct 01 13:23:31 crc kubenswrapper[4913]: I1001 13:23:31.883945 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5nxw" event={"ID":"d6e4192f-512f-4d3f-8730-51c2ea55be13","Type":"ContainerDied","Data":"5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4"} Oct 01 13:23:32 crc kubenswrapper[4913]: I1001 13:23:32.906827 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5nxw" event={"ID":"d6e4192f-512f-4d3f-8730-51c2ea55be13","Type":"ContainerStarted","Data":"d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc"} Oct 01 13:23:32 crc kubenswrapper[4913]: I1001 13:23:32.941792 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p5nxw" podStartSLOduration=2.437424954 podStartE2EDuration="4.941773656s" podCreationTimestamp="2025-10-01 13:23:28 +0000 UTC" firstStartedPulling="2025-10-01 13:23:29.862861189 +0000 UTC m=+2741.766336767" lastFinishedPulling="2025-10-01 13:23:32.367209881 +0000 UTC m=+2744.270685469" observedRunningTime="2025-10-01 13:23:32.938814586 +0000 UTC m=+2744.842290184" watchObservedRunningTime="2025-10-01 13:23:32.941773656 +0000 UTC m=+2744.845249244" Oct 01 13:23:38 crc kubenswrapper[4913]: I1001 13:23:38.847497 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:38 crc kubenswrapper[4913]: I1001 13:23:38.848148 4913 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:38 crc kubenswrapper[4913]: I1001 13:23:38.913336 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:39 crc kubenswrapper[4913]: I1001 13:23:39.014429 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:39 crc kubenswrapper[4913]: I1001 13:23:39.149113 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5nxw"] Oct 01 13:23:40 crc kubenswrapper[4913]: I1001 13:23:40.976020 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p5nxw" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="registry-server" containerID="cri-o://d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc" gracePeriod=2 Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.423972 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.498995 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-catalog-content\") pod \"d6e4192f-512f-4d3f-8730-51c2ea55be13\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.499097 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-utilities\") pod \"d6e4192f-512f-4d3f-8730-51c2ea55be13\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.499222 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46ch7\" (UniqueName: \"kubernetes.io/projected/d6e4192f-512f-4d3f-8730-51c2ea55be13-kube-api-access-46ch7\") pod \"d6e4192f-512f-4d3f-8730-51c2ea55be13\" (UID: \"d6e4192f-512f-4d3f-8730-51c2ea55be13\") " Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.501009 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-utilities" (OuterVolumeSpecName: "utilities") pod "d6e4192f-512f-4d3f-8730-51c2ea55be13" (UID: "d6e4192f-512f-4d3f-8730-51c2ea55be13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.508482 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e4192f-512f-4d3f-8730-51c2ea55be13-kube-api-access-46ch7" (OuterVolumeSpecName: "kube-api-access-46ch7") pod "d6e4192f-512f-4d3f-8730-51c2ea55be13" (UID: "d6e4192f-512f-4d3f-8730-51c2ea55be13"). InnerVolumeSpecName "kube-api-access-46ch7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.601051 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.601085 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46ch7\" (UniqueName: \"kubernetes.io/projected/d6e4192f-512f-4d3f-8730-51c2ea55be13-kube-api-access-46ch7\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.993794 4913 generic.go:334] "Generic (PLEG): container finished" podID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerID="d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc" exitCode=0 Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.993892 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5nxw" event={"ID":"d6e4192f-512f-4d3f-8730-51c2ea55be13","Type":"ContainerDied","Data":"d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc"} Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.993908 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5nxw" Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.993934 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5nxw" event={"ID":"d6e4192f-512f-4d3f-8730-51c2ea55be13","Type":"ContainerDied","Data":"4b419531136aa081faf21b0b08aae2741e6d25f1d5fd922aec574021e633bf84"} Oct 01 13:23:41 crc kubenswrapper[4913]: I1001 13:23:41.993970 4913 scope.go:117] "RemoveContainer" containerID="d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.020853 4913 scope.go:117] "RemoveContainer" containerID="5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.041164 4913 scope.go:117] "RemoveContainer" containerID="ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.083479 4913 scope.go:117] "RemoveContainer" containerID="d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc" Oct 01 13:23:42 crc kubenswrapper[4913]: E1001 13:23:42.083944 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc\": container with ID starting with d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc not found: ID does not exist" containerID="d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.083984 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc"} err="failed to get container status \"d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc\": rpc error: code = NotFound desc = could not find container \"d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc\": container with ID starting with d3c70f7cee5b6b727e95a2c30f8a0b98103a52630476c83b5f43d27516ea97cc not found: ID does not exist" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.084013 4913 scope.go:117] 
"RemoveContainer" containerID="5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4" Oct 01 13:23:42 crc kubenswrapper[4913]: E1001 13:23:42.084395 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4\": container with ID starting with 5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4 not found: ID does not exist" containerID="5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.084422 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4"} err="failed to get container status \"5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4\": rpc error: code = NotFound desc = could not find container \"5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4\": container with ID starting with 5c5b04594eb60cd280405176e2d120dbcf3b9d48567969cd9541702b2609daa4 not found: ID does not exist" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.084439 4913 scope.go:117] "RemoveContainer" containerID="ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7" Oct 01 13:23:42 crc kubenswrapper[4913]: E1001 13:23:42.084782 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7\": container with ID starting with ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7 not found: ID does not exist" containerID="ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.084827 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7"} err="failed to get container status \"ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7\": rpc error: code = NotFound desc = could not find container \"ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7\": container with ID starting with ddcd7badb1923d5a172bdbc7c84ab0bd765522ccec9194ffeb9db5fd96d8f9c7 not found: ID does not exist" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.160390 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6e4192f-512f-4d3f-8730-51c2ea55be13" (UID: "d6e4192f-512f-4d3f-8730-51c2ea55be13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.216164 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e4192f-512f-4d3f-8730-51c2ea55be13-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.345209 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5nxw"] Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.356372 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p5nxw"] Oct 01 13:23:42 crc kubenswrapper[4913]: I1001 13:23:42.818903 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" path="/var/lib/kubelet/pods/d6e4192f-512f-4d3f-8730-51c2ea55be13/volumes" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.316905 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xhnmx"] Oct 01 13:24:39 crc kubenswrapper[4913]: E1001 13:24:39.317826 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="extract-utilities" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.317842 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="extract-utilities" Oct 01 13:24:39 crc kubenswrapper[4913]: E1001 13:24:39.317896 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="extract-content" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.317903 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="extract-content" Oct 01 13:24:39 crc kubenswrapper[4913]: E1001 13:24:39.317919 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="registry-server" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.317925 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="registry-server" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.318155 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e4192f-512f-4d3f-8730-51c2ea55be13" containerName="registry-server" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.319583 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.327274 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhnmx"] Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.501969 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njq7m\" (UniqueName: \"kubernetes.io/projected/9b7459f5-961b-4fa2-90bd-721ab5a48531-kube-api-access-njq7m\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.502079 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-catalog-content\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.502200 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-utilities\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.603589 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-catalog-content\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.603702 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-utilities\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.603792 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njq7m\" (UniqueName: \"kubernetes.io/projected/9b7459f5-961b-4fa2-90bd-721ab5a48531-kube-api-access-njq7m\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.604362 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-catalog-content\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.604690 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-utilities\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.627552 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-njq7m\" (UniqueName: \"kubernetes.io/projected/9b7459f5-961b-4fa2-90bd-721ab5a48531-kube-api-access-njq7m\") pod \"redhat-operators-xhnmx\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") " pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:39 crc kubenswrapper[4913]: I1001 13:24:39.639985 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhnmx" Oct 01 13:24:40 crc kubenswrapper[4913]: I1001 13:24:40.083804 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:24:40 crc kubenswrapper[4913]: I1001 13:24:40.084166 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:24:40 crc kubenswrapper[4913]: I1001 13:24:40.149724 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhnmx"] Oct 01 13:24:40 crc kubenswrapper[4913]: I1001 13:24:40.503506 4913 generic.go:334] "Generic (PLEG): container finished" podID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerID="c93bdfd6030e21fc32fe99fbb5e31a5e4b006923f1f809eb8ebde53084404dcb" exitCode=0 Oct 01 13:24:40 crc kubenswrapper[4913]: I1001 13:24:40.503548 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerDied","Data":"c93bdfd6030e21fc32fe99fbb5e31a5e4b006923f1f809eb8ebde53084404dcb"} Oct 01 13:24:40 crc kubenswrapper[4913]: I1001 13:24:40.503574 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerStarted","Data":"335008904ed0c7234cecbc389de2024f25db1232ed4b964ed9a84fa17f907461"} Oct 01 13:24:41 crc kubenswrapper[4913]: I1001 13:24:41.521762 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerStarted","Data":"1a307649f750db0cf66fbbde29ca041f2bf4daeeaf88da9788f841defd397e44"} Oct 01 13:24:42 crc kubenswrapper[4913]: I1001 13:24:42.533864 4913 generic.go:334] "Generic (PLEG): container finished" podID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerID="1a307649f750db0cf66fbbde29ca041f2bf4daeeaf88da9788f841defd397e44" exitCode=0 Oct 01 13:24:42 crc kubenswrapper[4913]: I1001 13:24:42.533930 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerDied","Data":"1a307649f750db0cf66fbbde29ca041f2bf4daeeaf88da9788f841defd397e44"} Oct 01 13:24:43 crc kubenswrapper[4913]: I1001 13:24:43.544516 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerStarted","Data":"26f3fca955f77ad09cfa2eb4544990088efabea8e1288e14fa0c4ef247b2454f"} Oct 01 13:24:43 crc kubenswrapper[4913]: I1001 13:24:43.563375 4913 
Oct 01 13:24:49 crc kubenswrapper[4913]: I1001 13:24:49.659920 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xhnmx"
Oct 01 13:24:49 crc kubenswrapper[4913]: I1001 13:24:49.660453 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xhnmx"
Oct 01 13:24:49 crc kubenswrapper[4913]: I1001 13:24:49.706700 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xhnmx"
Oct 01 13:24:50 crc kubenswrapper[4913]: I1001 13:24:50.652136 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xhnmx"
Oct 01 13:24:50 crc kubenswrapper[4913]: I1001 13:24:50.698789 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhnmx"]
Oct 01 13:24:52 crc kubenswrapper[4913]: I1001 13:24:52.620659 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xhnmx" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="registry-server" containerID="cri-o://26f3fca955f77ad09cfa2eb4544990088efabea8e1288e14fa0c4ef247b2454f" gracePeriod=2
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.647566 4913 generic.go:334] "Generic (PLEG): container finished" podID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerID="26f3fca955f77ad09cfa2eb4544990088efabea8e1288e14fa0c4ef247b2454f" exitCode=0
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.647673 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerDied","Data":"26f3fca955f77ad09cfa2eb4544990088efabea8e1288e14fa0c4ef247b2454f"}
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.912398 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhnmx"
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.919958 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-catalog-content\") pod \"9b7459f5-961b-4fa2-90bd-721ab5a48531\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") "
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.920065 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njq7m\" (UniqueName: \"kubernetes.io/projected/9b7459f5-961b-4fa2-90bd-721ab5a48531-kube-api-access-njq7m\") pod \"9b7459f5-961b-4fa2-90bd-721ab5a48531\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") "
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.920114 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-utilities\") pod \"9b7459f5-961b-4fa2-90bd-721ab5a48531\" (UID: \"9b7459f5-961b-4fa2-90bd-721ab5a48531\") "
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.921706 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-utilities" (OuterVolumeSpecName: "utilities") pod "9b7459f5-961b-4fa2-90bd-721ab5a48531" (UID: "9b7459f5-961b-4fa2-90bd-721ab5a48531"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:24:54 crc kubenswrapper[4913]: I1001 13:24:54.933143 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7459f5-961b-4fa2-90bd-721ab5a48531-kube-api-access-njq7m" (OuterVolumeSpecName: "kube-api-access-njq7m") pod "9b7459f5-961b-4fa2-90bd-721ab5a48531" (UID: "9b7459f5-961b-4fa2-90bd-721ab5a48531"). InnerVolumeSpecName "kube-api-access-njq7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.023356 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njq7m\" (UniqueName: \"kubernetes.io/projected/9b7459f5-961b-4fa2-90bd-721ab5a48531-kube-api-access-njq7m\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.023389 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.032212 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b7459f5-961b-4fa2-90bd-721ab5a48531" (UID: "9b7459f5-961b-4fa2-90bd-721ab5a48531"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.125743 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7459f5-961b-4fa2-90bd-721ab5a48531-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.657798 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhnmx" event={"ID":"9b7459f5-961b-4fa2-90bd-721ab5a48531","Type":"ContainerDied","Data":"335008904ed0c7234cecbc389de2024f25db1232ed4b964ed9a84fa17f907461"}
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.657861 4913 scope.go:117] "RemoveContainer" containerID="26f3fca955f77ad09cfa2eb4544990088efabea8e1288e14fa0c4ef247b2454f"
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.657893 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhnmx"
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.677636 4913 scope.go:117] "RemoveContainer" containerID="1a307649f750db0cf66fbbde29ca041f2bf4daeeaf88da9788f841defd397e44"
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.701241 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhnmx"]
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.710525 4913 scope.go:117] "RemoveContainer" containerID="c93bdfd6030e21fc32fe99fbb5e31a5e4b006923f1f809eb8ebde53084404dcb"
Oct 01 13:24:55 crc kubenswrapper[4913]: I1001 13:24:55.713401 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xhnmx"]
Oct 01 13:24:56 crc kubenswrapper[4913]: I1001 13:24:56.832292 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" path="/var/lib/kubelet/pods/9b7459f5-961b-4fa2-90bd-721ab5a48531/volumes"
Oct 01 13:25:09 crc kubenswrapper[4913]: I1001 13:25:09.808142 4913 generic.go:334] "Generic (PLEG): container finished" podID="5248ec5f-6231-40a3-be9d-815bdf5ec259" containerID="e599f071f4931b15fbb327f1a3b2ebc822434f65c67d3eb0730958802349bdf8" exitCode=0
Oct 01 13:25:09 crc kubenswrapper[4913]: I1001 13:25:09.808246 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" event={"ID":"5248ec5f-6231-40a3-be9d-815bdf5ec259","Type":"ContainerDied","Data":"e599f071f4931b15fbb327f1a3b2ebc822434f65c67d3eb0730958802349bdf8"}
Oct 01 13:25:10 crc kubenswrapper[4913]: I1001 13:25:10.083485 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:25:10 crc kubenswrapper[4913]: I1001 13:25:10.083841 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.225638 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m"
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.240845 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ssh-key\") pod \"5248ec5f-6231-40a3-be9d-815bdf5ec259\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.240920 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ceph\") pod \"5248ec5f-6231-40a3-be9d-815bdf5ec259\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.240953 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-secret-0\") pod \"5248ec5f-6231-40a3-be9d-815bdf5ec259\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.241043 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-combined-ca-bundle\") pod \"5248ec5f-6231-40a3-be9d-815bdf5ec259\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.241147 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-inventory\") pod \"5248ec5f-6231-40a3-be9d-815bdf5ec259\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.241209 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzpc6\" (UniqueName: \"kubernetes.io/projected/5248ec5f-6231-40a3-be9d-815bdf5ec259-kube-api-access-xzpc6\") pod \"5248ec5f-6231-40a3-be9d-815bdf5ec259\" (UID: \"5248ec5f-6231-40a3-be9d-815bdf5ec259\") " Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.248332 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ceph" (OuterVolumeSpecName: "ceph") pod "5248ec5f-6231-40a3-be9d-815bdf5ec259" (UID: "5248ec5f-6231-40a3-be9d-815bdf5ec259"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.249606 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5248ec5f-6231-40a3-be9d-815bdf5ec259" (UID: "5248ec5f-6231-40a3-be9d-815bdf5ec259"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.250361 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5248ec5f-6231-40a3-be9d-815bdf5ec259-kube-api-access-xzpc6" (OuterVolumeSpecName: "kube-api-access-xzpc6") pod "5248ec5f-6231-40a3-be9d-815bdf5ec259" (UID: "5248ec5f-6231-40a3-be9d-815bdf5ec259"). InnerVolumeSpecName "kube-api-access-xzpc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.268776 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-inventory" (OuterVolumeSpecName: "inventory") pod "5248ec5f-6231-40a3-be9d-815bdf5ec259" (UID: "5248ec5f-6231-40a3-be9d-815bdf5ec259"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.278711 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5248ec5f-6231-40a3-be9d-815bdf5ec259" (UID: "5248ec5f-6231-40a3-be9d-815bdf5ec259"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.297521 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5248ec5f-6231-40a3-be9d-815bdf5ec259" (UID: "5248ec5f-6231-40a3-be9d-815bdf5ec259"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.342605 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzpc6\" (UniqueName: \"kubernetes.io/projected/5248ec5f-6231-40a3-be9d-815bdf5ec259-kube-api-access-xzpc6\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.342680 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.342691 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.342702 4913 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.342712 4913 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.342720 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5248ec5f-6231-40a3-be9d-815bdf5ec259-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.849039 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" event={"ID":"5248ec5f-6231-40a3-be9d-815bdf5ec259","Type":"ContainerDied","Data":"69126c6b3873c6d3237a89a0d6b13115a8a536f97e35f964032eecb646461783"} Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.849109 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69126c6b3873c6d3237a89a0d6b13115a8a536f97e35f964032eecb646461783" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.849179 4913 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znc8m" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.935468 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd"] Oct 01 13:25:11 crc kubenswrapper[4913]: E1001 13:25:11.935883 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5248ec5f-6231-40a3-be9d-815bdf5ec259" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.935904 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5248ec5f-6231-40a3-be9d-815bdf5ec259" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:11 crc kubenswrapper[4913]: E1001 13:25:11.935946 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="registry-server" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.935953 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="registry-server" Oct 01 13:25:11 crc kubenswrapper[4913]: E1001 13:25:11.935965 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="extract-content" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.935972 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="extract-content" Oct 01 13:25:11 crc kubenswrapper[4913]: E1001 13:25:11.935984 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="extract-utilities" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.935991 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="extract-utilities" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.936185 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5248ec5f-6231-40a3-be9d-815bdf5ec259" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.936212 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7459f5-961b-4fa2-90bd-721ab5a48531" containerName="registry-server" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.936860 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.941643 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.941748 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.943753 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.944464 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.945792 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.945957 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.947195 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd"] Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.948699 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.948861 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 13:25:11 crc kubenswrapper[4913]: I1001 13:25:11.949005 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hpnkx" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054248 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054517 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4hnc\" (UniqueName: \"kubernetes.io/projected/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-kube-api-access-t4hnc\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054557 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054582 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054600 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054634 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054649 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054669 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054693 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054712 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.054732 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.156707 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157091 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4hnc\" (UniqueName: \"kubernetes.io/projected/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-kube-api-access-t4hnc\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157144 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157189 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157218 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157287 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157312 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157341 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157379 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157408 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.157443 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.158512 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.158604 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.161356 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.161446 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.162018 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.162503 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.162673 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.162908 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.163210 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.176038 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.177971 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4hnc\" (UniqueName: \"kubernetes.io/projected/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-kube-api-access-t4hnc\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.251902 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.755329 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd"] Oct 01 13:25:12 crc kubenswrapper[4913]: I1001 13:25:12.857572 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" event={"ID":"669a8e92-9f6b-4ae7-9647-10c9e96d2de4","Type":"ContainerStarted","Data":"2043e6daab17ca1e3bd275d67630298cf700aeaa5a70975d8e633736b7209c63"} Oct 01 13:25:13 crc kubenswrapper[4913]: I1001 13:25:13.869490 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" event={"ID":"669a8e92-9f6b-4ae7-9647-10c9e96d2de4","Type":"ContainerStarted","Data":"cfbfd863fc5ee0bd975852e985a88f4af3302ffbe95edd226a3af57b88c306c7"} Oct 01 13:25:13 crc kubenswrapper[4913]: I1001 13:25:13.891788 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" podStartSLOduration=2.132331023 podStartE2EDuration="2.891769306s" podCreationTimestamp="2025-10-01 13:25:11 +0000 UTC" firstStartedPulling="2025-10-01 13:25:12.767317033 +0000 UTC m=+2844.670792611" lastFinishedPulling="2025-10-01 13:25:13.526755276 +0000 UTC m=+2845.430230894" observedRunningTime="2025-10-01 13:25:13.889318169 +0000 UTC m=+2845.792793757" watchObservedRunningTime="2025-10-01 13:25:13.891769306 +0000 UTC m=+2845.795244904" Oct 01 13:25:40 crc kubenswrapper[4913]: I1001 13:25:40.084335 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:25:40 crc kubenswrapper[4913]: I1001 13:25:40.086224 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:25:40 crc kubenswrapper[4913]: I1001 13:25:40.086415 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:25:40 crc kubenswrapper[4913]: I1001 13:25:40.087236 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:25:40 crc kubenswrapper[4913]: I1001 13:25:40.087377 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" gracePeriod=600 Oct 01 13:25:40 crc kubenswrapper[4913]: E1001 13:25:40.217133 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:25:41 crc kubenswrapper[4913]: I1001 13:25:41.135048 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" exitCode=0 Oct 01 13:25:41 crc kubenswrapper[4913]: I1001 13:25:41.135093 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6"} Oct 01 13:25:41 crc kubenswrapper[4913]: I1001 13:25:41.135171 4913 scope.go:117] "RemoveContainer" containerID="1b76ae050c378f5829f6b759e6437f37d875093ba22902111d9211cf256e9b0f" Oct 01 13:25:41 crc kubenswrapper[4913]: I1001 13:25:41.135805 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:25:41 crc kubenswrapper[4913]: E1001 13:25:41.136145 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:25:53 crc kubenswrapper[4913]: I1001 13:25:53.806656 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:25:53 crc kubenswrapper[4913]: E1001 13:25:53.807589 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:26:07 crc kubenswrapper[4913]: I1001 13:26:07.807240 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:26:07 crc kubenswrapper[4913]: E1001 13:26:07.808060 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:26:22 crc kubenswrapper[4913]: I1001 13:26:22.806629 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:26:22 crc kubenswrapper[4913]: E1001 13:26:22.807404 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:26:34 crc kubenswrapper[4913]: I1001 13:26:34.807394 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:26:34 crc kubenswrapper[4913]: E1001 13:26:34.808325 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:26:45 crc kubenswrapper[4913]: I1001 13:26:45.807094 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:26:45 crc kubenswrapper[4913]: E1001 13:26:45.807868 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:27:00 crc kubenswrapper[4913]: I1001 13:27:00.807354 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:27:00 crc kubenswrapper[4913]: E1001 13:27:00.808597 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:27:15 crc kubenswrapper[4913]: I1001 13:27:15.806659 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:27:15 crc kubenswrapper[4913]: E1001 13:27:15.807459 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:27:29 crc kubenswrapper[4913]: I1001 13:27:29.807588 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:27:29 crc kubenswrapper[4913]: E1001 13:27:29.808691 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:27:43 crc kubenswrapper[4913]: I1001 13:27:43.806974 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:27:43 crc kubenswrapper[4913]: E1001 13:27:43.807939 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:27:58 crc kubenswrapper[4913]: I1001 13:27:58.826125 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:27:58 crc kubenswrapper[4913]: E1001 13:27:58.827071 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:28:12 crc kubenswrapper[4913]: I1001 13:28:12.807384 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:28:12 crc kubenswrapper[4913]: E1001 13:28:12.808021 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:28:27 crc kubenswrapper[4913]: I1001 13:28:27.807031 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:28:27 crc kubenswrapper[4913]: E1001 13:28:27.807784 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:28:39 crc kubenswrapper[4913]: I1001 13:28:39.807163 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:28:39 crc kubenswrapper[4913]: E1001 13:28:39.807887 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:28:52 crc kubenswrapper[4913]: I1001 13:28:52.806224 4913 
scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:28:52 crc kubenswrapper[4913]: E1001 13:28:52.806987 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:29:06 crc kubenswrapper[4913]: I1001 13:29:06.807448 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:29:06 crc kubenswrapper[4913]: E1001 13:29:06.808496 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:29:17 crc kubenswrapper[4913]: I1001 13:29:17.807497 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:29:17 crc kubenswrapper[4913]: E1001 13:29:17.808700 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:29:32 crc kubenswrapper[4913]: I1001 13:29:32.806641 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:29:32 crc kubenswrapper[4913]: E1001 13:29:32.807649 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:29:43 crc kubenswrapper[4913]: I1001 13:29:43.807781 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:29:43 crc kubenswrapper[4913]: E1001 13:29:43.808828 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:29:53 crc kubenswrapper[4913]: I1001 13:29:53.503022 4913 generic.go:334] "Generic (PLEG): container finished" podID="669a8e92-9f6b-4ae7-9647-10c9e96d2de4" containerID="cfbfd863fc5ee0bd975852e985a88f4af3302ffbe95edd226a3af57b88c306c7" exitCode=0 Oct 01 
13:29:53 crc kubenswrapper[4913]: I1001 13:29:53.503101 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" event={"ID":"669a8e92-9f6b-4ae7-9647-10c9e96d2de4","Type":"ContainerDied","Data":"cfbfd863fc5ee0bd975852e985a88f4af3302ffbe95edd226a3af57b88c306c7"} Oct 01 13:29:54 crc kubenswrapper[4913]: I1001 13:29:54.962180 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.066774 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-0\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067129 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4hnc\" (UniqueName: \"kubernetes.io/projected/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-kube-api-access-t4hnc\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067173 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067204 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-1\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067287 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-1\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067321 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-extra-config-0\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067356 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-custom-ceph-combined-ca-bundle\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067390 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-inventory\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067438 4913 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-0\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067475 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph-nova-0\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.067587 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ssh-key\") pod \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\" (UID: \"669a8e92-9f6b-4ae7-9647-10c9e96d2de4\") " Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.072444 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph" (OuterVolumeSpecName: "ceph") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.073045 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.073879 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-kube-api-access-t4hnc" (OuterVolumeSpecName: "kube-api-access-t4hnc") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "kube-api-access-t4hnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.092152 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.092817 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-inventory" (OuterVolumeSpecName: "inventory") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.094001 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.096853 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.096872 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.097234 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.101369 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.102429 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "669a8e92-9f6b-4ae7-9647-10c9e96d2de4" (UID: "669a8e92-9f6b-4ae7-9647-10c9e96d2de4"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170623 4913 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170662 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4hnc\" (UniqueName: \"kubernetes.io/projected/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-kube-api-access-t4hnc\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170677 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170688 4913 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170699 4913 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170711 4913 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170722 4913 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170738 4913 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170750 4913 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170760 4913 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.170767 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/669a8e92-9f6b-4ae7-9647-10c9e96d2de4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.522127 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" event={"ID":"669a8e92-9f6b-4ae7-9647-10c9e96d2de4","Type":"ContainerDied","Data":"2043e6daab17ca1e3bd275d67630298cf700aeaa5a70975d8e633736b7209c63"} Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.522172 4913 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="2043e6daab17ca1e3bd275d67630298cf700aeaa5a70975d8e633736b7209c63" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.522232 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd" Oct 01 13:29:55 crc kubenswrapper[4913]: I1001 13:29:55.807128 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:29:55 crc kubenswrapper[4913]: E1001 13:29:55.807527 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.155714 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb"] Oct 01 13:30:00 crc kubenswrapper[4913]: E1001 13:30:00.157537 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669a8e92-9f6b-4ae7-9647-10c9e96d2de4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.157635 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="669a8e92-9f6b-4ae7-9647-10c9e96d2de4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.157887 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="669a8e92-9f6b-4ae7-9647-10c9e96d2de4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.158675 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.161341 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.164352 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb"] Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.164824 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.267160 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrx9\" (UniqueName: \"kubernetes.io/projected/c9469dd7-5611-48f4-868d-1a36066f43d0-kube-api-access-zjrx9\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.267236 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9469dd7-5611-48f4-868d-1a36066f43d0-secret-volume\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.267355 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9469dd7-5611-48f4-868d-1a36066f43d0-config-volume\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.368563 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9469dd7-5611-48f4-868d-1a36066f43d0-config-volume\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.368676 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjrx9\" (UniqueName: \"kubernetes.io/projected/c9469dd7-5611-48f4-868d-1a36066f43d0-kube-api-access-zjrx9\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.368722 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9469dd7-5611-48f4-868d-1a36066f43d0-secret-volume\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.369854 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9469dd7-5611-48f4-868d-1a36066f43d0-config-volume\") pod 
\"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.377115 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9469dd7-5611-48f4-868d-1a36066f43d0-secret-volume\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.388183 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjrx9\" (UniqueName: \"kubernetes.io/projected/c9469dd7-5611-48f4-868d-1a36066f43d0-kube-api-access-zjrx9\") pod \"collect-profiles-29322090-r7skb\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.489809 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:00 crc kubenswrapper[4913]: I1001 13:30:00.929106 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb"] Oct 01 13:30:01 crc kubenswrapper[4913]: I1001 13:30:01.601721 4913 generic.go:334] "Generic (PLEG): container finished" podID="c9469dd7-5611-48f4-868d-1a36066f43d0" containerID="06be0009edb44b7f12cdcf42d446a82ebf846b5aebe201febcee683d247033e5" exitCode=0 Oct 01 13:30:01 crc kubenswrapper[4913]: I1001 13:30:01.601905 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" event={"ID":"c9469dd7-5611-48f4-868d-1a36066f43d0","Type":"ContainerDied","Data":"06be0009edb44b7f12cdcf42d446a82ebf846b5aebe201febcee683d247033e5"} Oct 01 13:30:01 crc kubenswrapper[4913]: I1001 13:30:01.602093 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" event={"ID":"c9469dd7-5611-48f4-868d-1a36066f43d0","Type":"ContainerStarted","Data":"850ee151043cba017d6c4e8ad214e84c8b0a8c4063287b4b5f32f8b87003cd43"} Oct 01 13:30:02 crc kubenswrapper[4913]: I1001 13:30:02.945573 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.027397 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9469dd7-5611-48f4-868d-1a36066f43d0-secret-volume\") pod \"c9469dd7-5611-48f4-868d-1a36066f43d0\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.027644 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjrx9\" (UniqueName: \"kubernetes.io/projected/c9469dd7-5611-48f4-868d-1a36066f43d0-kube-api-access-zjrx9\") pod \"c9469dd7-5611-48f4-868d-1a36066f43d0\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.027744 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9469dd7-5611-48f4-868d-1a36066f43d0-config-volume\") pod \"c9469dd7-5611-48f4-868d-1a36066f43d0\" (UID: \"c9469dd7-5611-48f4-868d-1a36066f43d0\") " Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.028479 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9469dd7-5611-48f4-868d-1a36066f43d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9469dd7-5611-48f4-868d-1a36066f43d0" (UID: "c9469dd7-5611-48f4-868d-1a36066f43d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.032934 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9469dd7-5611-48f4-868d-1a36066f43d0-kube-api-access-zjrx9" (OuterVolumeSpecName: "kube-api-access-zjrx9") pod "c9469dd7-5611-48f4-868d-1a36066f43d0" (UID: "c9469dd7-5611-48f4-868d-1a36066f43d0"). InnerVolumeSpecName "kube-api-access-zjrx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.033017 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9469dd7-5611-48f4-868d-1a36066f43d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9469dd7-5611-48f4-868d-1a36066f43d0" (UID: "c9469dd7-5611-48f4-868d-1a36066f43d0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.130181 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjrx9\" (UniqueName: \"kubernetes.io/projected/c9469dd7-5611-48f4-868d-1a36066f43d0-kube-api-access-zjrx9\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.130235 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9469dd7-5611-48f4-868d-1a36066f43d0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.130254 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9469dd7-5611-48f4-868d-1a36066f43d0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.619362 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" event={"ID":"c9469dd7-5611-48f4-868d-1a36066f43d0","Type":"ContainerDied","Data":"850ee151043cba017d6c4e8ad214e84c8b0a8c4063287b4b5f32f8b87003cd43"} Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.619401 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="850ee151043cba017d6c4e8ad214e84c8b0a8c4063287b4b5f32f8b87003cd43" Oct 01 13:30:03 crc kubenswrapper[4913]: I1001 13:30:03.619419 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb" Oct 01 13:30:04 crc kubenswrapper[4913]: I1001 13:30:04.023254 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r"] Oct 01 13:30:04 crc kubenswrapper[4913]: I1001 13:30:04.031546 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-gbt8r"] Oct 01 13:30:04 crc kubenswrapper[4913]: I1001 13:30:04.818556 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e" path="/var/lib/kubelet/pods/3faa9bbb-74ee-4dd7-9333-a9ef4aac6f7e/volumes" Oct 01 13:30:07 crc kubenswrapper[4913]: I1001 13:30:07.284388 4913 scope.go:117] "RemoveContainer" containerID="280737ff584f454f5e74c72fe99cc660a6076c1416c8bfe3c96f4cd9fa66789b" Oct 01 13:30:08 crc kubenswrapper[4913]: I1001 13:30:08.813068 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:30:08 crc kubenswrapper[4913]: E1001 13:30:08.813679 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.884442 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 01 13:30:09 crc kubenswrapper[4913]: E1001 13:30:09.884796 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9469dd7-5611-48f4-868d-1a36066f43d0" containerName="collect-profiles" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 
13:30:09.884809 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9469dd7-5611-48f4-868d-1a36066f43d0" containerName="collect-profiles" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.885000 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9469dd7-5611-48f4-868d-1a36066f43d0" containerName="collect-profiles" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.886180 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.893717 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.915976 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.920459 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.972579 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.974187 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.976658 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 01 13:30:09 crc kubenswrapper[4913]: I1001 13:30:09.988580 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052004 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052340 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052366 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052436 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052479 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-sys\") pod \"cinder-volume-volume1-0\" (UID: 
\"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052495 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f8627d36-1d7b-40fa-b011-7a1dacddb61c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052519 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052541 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052704 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-run\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052759 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052822 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052865 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052879 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052917 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " 
pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.052975 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.053022 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgbp6\" (UniqueName: \"kubernetes.io/projected/f8627d36-1d7b-40fa-b011-7a1dacddb61c-kube-api-access-pgbp6\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154167 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154211 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-config-data\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154233 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154281 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154304 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154325 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154360 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154380 
4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gxqq\" (UniqueName: \"kubernetes.io/projected/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-kube-api-access-2gxqq\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154411 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-ceph\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154441 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-lib-modules\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154466 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154462 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154481 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154542 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154563 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154667 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154700 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154718 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-run\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154760 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154791 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f8627d36-1d7b-40fa-b011-7a1dacddb61c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154827 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154856 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154878 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154900 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154920 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-run\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154936 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 
13:30:10.154952 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-sys\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.154972 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155018 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155043 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155059 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155084 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155116 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155137 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-dev\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155157 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-scripts\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155176 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-locks-cinder\") pod 
\"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155200 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgbp6\" (UniqueName: \"kubernetes.io/projected/f8627d36-1d7b-40fa-b011-7a1dacddb61c-kube-api-access-pgbp6\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.155777 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.156382 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-run\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.156439 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.156475 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.156530 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8627d36-1d7b-40fa-b011-7a1dacddb61c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.162222 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.162388 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.162513 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.162584 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f8627d36-1d7b-40fa-b011-7a1dacddb61c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.163034 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f8627d36-1d7b-40fa-b011-7a1dacddb61c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.175549 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgbp6\" (UniqueName: \"kubernetes.io/projected/f8627d36-1d7b-40fa-b011-7a1dacddb61c-kube-api-access-pgbp6\") pod \"cinder-volume-volume1-0\" (UID: \"f8627d36-1d7b-40fa-b011-7a1dacddb61c\") " pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.221318 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257348 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257419 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257449 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-sys\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257591 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-dev\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257635 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-scripts\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257667 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257711 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-config-data\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") 
" pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257738 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257761 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257843 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257870 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gxqq\" (UniqueName: \"kubernetes.io/projected/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-kube-api-access-2gxqq\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257930 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-ceph\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257986 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-lib-modules\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.257991 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258032 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258060 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258068 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-locks-cinder\") pod \"cinder-backup-0\" (UID: 
\"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258129 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-run\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258325 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-run\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258358 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258743 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-sys\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258886 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.258963 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.259020 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-dev\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.259045 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-lib-modules\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.259367 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.265840 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-scripts\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc 
kubenswrapper[4913]: I1001 13:30:10.266935 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.275669 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-config-data\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.276988 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-ceph\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.277393 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gxqq\" (UniqueName: \"kubernetes.io/projected/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-kube-api-access-2gxqq\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.288290 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed\") " pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.290740 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.529992 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-tbv5k"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.531695 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.546109 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-tbv5k"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.671112 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs5qk\" (UniqueName: \"kubernetes.io/projected/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b-kube-api-access-vs5qk\") pod \"manila-db-create-tbv5k\" (UID: \"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b\") " pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.717137 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.719202 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.720852 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tjvjz" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.720941 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.721239 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.721290 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.728005 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.766475 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.769612 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.774719 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.775457 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs5qk\" (UniqueName: \"kubernetes.io/projected/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b-kube-api-access-vs5qk\") pod \"manila-db-create-tbv5k\" (UID: \"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b\") " pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.776041 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.827537 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs5qk\" (UniqueName: \"kubernetes.io/projected/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b-kube-api-access-vs5qk\") pod \"manila-db-create-tbv5k\" (UID: \"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b\") " pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.863679 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877040 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877090 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877108 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877136 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877172 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed64cca4-22fc-4756-863f-8cee18a7f40e-ceph\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877191 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877242 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877345 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877388 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8xt\" (UniqueName: \"kubernetes.io/projected/ed64cca4-22fc-4756-863f-8cee18a7f40e-kube-api-access-pm8xt\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877417 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877434 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed64cca4-22fc-4756-863f-8cee18a7f40e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.877469 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k5cf\" (UniqueName: \"kubernetes.io/projected/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-kube-api-access-8k5cf\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.878245 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.878329 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed64cca4-22fc-4756-863f-8cee18a7f40e-logs\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.878359 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.878385 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.878407 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.881400 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.926884 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.941620 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982251 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982325 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8xt\" (UniqueName: \"kubernetes.io/projected/ed64cca4-22fc-4756-863f-8cee18a7f40e-kube-api-access-pm8xt\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982358 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982378 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed64cca4-22fc-4756-863f-8cee18a7f40e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982405 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k5cf\" (UniqueName: \"kubernetes.io/projected/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-kube-api-access-8k5cf\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982427 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982456 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed64cca4-22fc-4756-863f-8cee18a7f40e-logs\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982478 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982501 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982521 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982547 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982561 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982575 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982599 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982631 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed64cca4-22fc-4756-863f-8cee18a7f40e-ceph\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982650 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982693 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.982710 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.983040 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.986495 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.986782 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.988767 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed64cca4-22fc-4756-863f-8cee18a7f40e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.993700 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed64cca4-22fc-4756-863f-8cee18a7f40e-logs\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:10 crc kubenswrapper[4913]: I1001 13:30:10.997734 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.010207 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ed64cca4-22fc-4756-863f-8cee18a7f40e-ceph\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.011682 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc 
kubenswrapper[4913]: I1001 13:30:11.011843 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.013630 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.013729 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.014599 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.016947 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.020101 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8xt\" (UniqueName: \"kubernetes.io/projected/ed64cca4-22fc-4756-863f-8cee18a7f40e-kube-api-access-pm8xt\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.021405 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed64cca4-22fc-4756-863f-8cee18a7f40e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.023059 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.023117 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.023187 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k5cf\" (UniqueName: 
\"kubernetes.io/projected/e6353b13-0958-4f48-ac1b-0a2aeef50ad8-kube-api-access-8k5cf\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.034014 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ed64cca4-22fc-4756-863f-8cee18a7f40e\") " pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.054380 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6353b13-0958-4f48-ac1b-0a2aeef50ad8\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.059403 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.103831 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.105368 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 01 13:30:11 crc kubenswrapper[4913]: W1001 13:30:11.106534 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d0ca0c4_37f7_43e3_8a6f_b9c5c42d02ed.slice/crio-23c954ee330f359678aee2dd4a0d603a1534bc7b962208bd9889fbd17cf29529 WatchSource:0}: Error finding container 23c954ee330f359678aee2dd4a0d603a1534bc7b962208bd9889fbd17cf29529: Status 404 returned error can't find the container with id 23c954ee330f359678aee2dd4a0d603a1534bc7b962208bd9889fbd17cf29529 Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.265958 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-tbv5k"] Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.561073 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:30:11 crc kubenswrapper[4913]: W1001 13:30:11.572815 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6353b13_0958_4f48_ac1b_0a2aeef50ad8.slice/crio-ca3c27986fe7734c54b08aa58bc776ddfdd540ea86638af8cfe6ccec16b6a7b1 WatchSource:0}: Error finding container ca3c27986fe7734c54b08aa58bc776ddfdd540ea86638af8cfe6ccec16b6a7b1: Status 404 returned error can't find the container with id ca3c27986fe7734c54b08aa58bc776ddfdd540ea86638af8cfe6ccec16b6a7b1 Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.681100 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f8627d36-1d7b-40fa-b011-7a1dacddb61c","Type":"ContainerStarted","Data":"8ee947daf9969bdad878f2626e3c1e995c6d67ccaf9793e966b7233142c809b3"} Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.683434 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-tbv5k" event={"ID":"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b","Type":"ContainerStarted","Data":"1b9b027ce97c9f98ff8003a295a9d2aacf633c3ece67877c1e7b8fadc5ae8f87"} Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.686928 4913 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6353b13-0958-4f48-ac1b-0a2aeef50ad8","Type":"ContainerStarted","Data":"ca3c27986fe7734c54b08aa58bc776ddfdd540ea86638af8cfe6ccec16b6a7b1"} Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.690167 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed","Type":"ContainerStarted","Data":"23c954ee330f359678aee2dd4a0d603a1534bc7b962208bd9889fbd17cf29529"} Oct 01 13:30:11 crc kubenswrapper[4913]: I1001 13:30:11.701015 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:30:11 crc kubenswrapper[4913]: W1001 13:30:11.712374 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded64cca4_22fc_4756_863f_8cee18a7f40e.slice/crio-a6e698e9338f235d69130a6c3e776e25c751c166c61d6340bc9082175b0ffbb8 WatchSource:0}: Error finding container a6e698e9338f235d69130a6c3e776e25c751c166c61d6340bc9082175b0ffbb8: Status 404 returned error can't find the container with id a6e698e9338f235d69130a6c3e776e25c751c166c61d6340bc9082175b0ffbb8 Oct 01 13:30:12 crc kubenswrapper[4913]: I1001 13:30:12.699097 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ed64cca4-22fc-4756-863f-8cee18a7f40e","Type":"ContainerStarted","Data":"a6e698e9338f235d69130a6c3e776e25c751c166c61d6340bc9082175b0ffbb8"} Oct 01 13:30:12 crc kubenswrapper[4913]: I1001 13:30:12.703087 4913 generic.go:334] "Generic (PLEG): container finished" podID="9154b01b-b6c6-450a-a022-2c0c7f6ccf9b" containerID="a6a4ca061bc171ba308cf95a966fabaa5ec51392e93749ab08d45083c25ef18e" exitCode=0 Oct 01 13:30:12 crc kubenswrapper[4913]: I1001 13:30:12.703147 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-tbv5k" event={"ID":"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b","Type":"ContainerDied","Data":"a6a4ca061bc171ba308cf95a966fabaa5ec51392e93749ab08d45083c25ef18e"} Oct 01 13:30:12 crc kubenswrapper[4913]: I1001 13:30:12.705161 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6353b13-0958-4f48-ac1b-0a2aeef50ad8","Type":"ContainerStarted","Data":"e0e422cd6a20417d22030d75af72e399f566a5ef0fbadfecda08d3feec1c0c99"} Oct 01 13:30:13 crc kubenswrapper[4913]: I1001 13:30:13.716284 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ed64cca4-22fc-4756-863f-8cee18a7f40e","Type":"ContainerStarted","Data":"d727fed0242e1b1123b1fda799478c91b54014d27a1236bd90dc56633c876aad"} Oct 01 13:30:13 crc kubenswrapper[4913]: I1001 13:30:13.716953 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ed64cca4-22fc-4756-863f-8cee18a7f40e","Type":"ContainerStarted","Data":"e14bda96ee2d9ee471cbb8192cd5b79b44bf4f1a00b645801023307eba1f4441"} Oct 01 13:30:13 crc kubenswrapper[4913]: I1001 13:30:13.721794 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6353b13-0958-4f48-ac1b-0a2aeef50ad8","Type":"ContainerStarted","Data":"6c12115e0010429aaca4e117b05b4e36f117f1beae5fa4951d9fd9d1750fa181"} Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.742778 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-tbv5k" 
event={"ID":"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b","Type":"ContainerDied","Data":"1b9b027ce97c9f98ff8003a295a9d2aacf633c3ece67877c1e7b8fadc5ae8f87"} Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.743404 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b9b027ce97c9f98ff8003a295a9d2aacf633c3ece67877c1e7b8fadc5ae8f87" Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.773248 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.784779 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.784751883 podStartE2EDuration="5.784751883s" podCreationTimestamp="2025-10-01 13:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:30:14.770380621 +0000 UTC m=+3146.673856219" watchObservedRunningTime="2025-10-01 13:30:14.784751883 +0000 UTC m=+3146.688227471" Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.789204 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.789182278 podStartE2EDuration="5.789182278s" podCreationTimestamp="2025-10-01 13:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:30:13.743607212 +0000 UTC m=+3145.647082820" watchObservedRunningTime="2025-10-01 13:30:14.789182278 +0000 UTC m=+3146.692657866" Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.876351 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs5qk\" (UniqueName: \"kubernetes.io/projected/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b-kube-api-access-vs5qk\") pod \"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b\" (UID: \"9154b01b-b6c6-450a-a022-2c0c7f6ccf9b\") " Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.883780 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b-kube-api-access-vs5qk" (OuterVolumeSpecName: "kube-api-access-vs5qk") pod "9154b01b-b6c6-450a-a022-2c0c7f6ccf9b" (UID: "9154b01b-b6c6-450a-a022-2c0c7f6ccf9b"). InnerVolumeSpecName "kube-api-access-vs5qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:14 crc kubenswrapper[4913]: I1001 13:30:14.980032 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs5qk\" (UniqueName: \"kubernetes.io/projected/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b-kube-api-access-vs5qk\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:15 crc kubenswrapper[4913]: I1001 13:30:15.760520 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f8627d36-1d7b-40fa-b011-7a1dacddb61c","Type":"ContainerStarted","Data":"cf039ce87f304b34ca0988b187211426b54d3d7b33d9025bb5c0e451d499aef4"} Oct 01 13:30:15 crc kubenswrapper[4913]: I1001 13:30:15.762748 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-tbv5k" Oct 01 13:30:15 crc kubenswrapper[4913]: I1001 13:30:15.767590 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed","Type":"ContainerStarted","Data":"e54cf3e31148504cc668f1a82fe228e8cdfd8ddcbc95c559a2b4ad66d33f14d2"} Oct 01 13:30:16 crc kubenswrapper[4913]: I1001 13:30:16.779733 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed","Type":"ContainerStarted","Data":"5137dc482cacefced05da34eddcbffd2a553da87ebec2679848d1fe75fe91078"} Oct 01 13:30:16 crc kubenswrapper[4913]: I1001 13:30:16.781847 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f8627d36-1d7b-40fa-b011-7a1dacddb61c","Type":"ContainerStarted","Data":"05f31662467bbf7a7358997964f267d489ec6b2dd6e5de687b5d2418e2238162"} Oct 01 13:30:16 crc kubenswrapper[4913]: I1001 13:30:16.824034 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.913073531 podStartE2EDuration="7.824018022s" podCreationTimestamp="2025-10-01 13:30:09 +0000 UTC" firstStartedPulling="2025-10-01 13:30:11.121501753 +0000 UTC m=+3143.024977331" lastFinishedPulling="2025-10-01 13:30:15.032446244 +0000 UTC m=+3146.935921822" observedRunningTime="2025-10-01 13:30:16.821030638 +0000 UTC m=+3148.724506236" watchObservedRunningTime="2025-10-01 13:30:16.824018022 +0000 UTC m=+3148.727493600" Oct 01 13:30:16 crc kubenswrapper[4913]: I1001 13:30:16.850585 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.758008056 podStartE2EDuration="7.850566016s" podCreationTimestamp="2025-10-01 13:30:09 +0000 UTC" firstStartedPulling="2025-10-01 13:30:10.941400046 +0000 UTC m=+3142.844875624" lastFinishedPulling="2025-10-01 13:30:15.033958006 +0000 UTC m=+3146.937433584" observedRunningTime="2025-10-01 13:30:16.845994138 +0000 UTC m=+3148.749469716" watchObservedRunningTime="2025-10-01 13:30:16.850566016 +0000 UTC m=+3148.754041594" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.222076 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.291455 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.718350 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.723377 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.759719 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-1f2e-account-create-t5r92"] Oct 01 13:30:20 crc kubenswrapper[4913]: E1001 13:30:20.760195 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9154b01b-b6c6-450a-a022-2c0c7f6ccf9b" containerName="mariadb-database-create" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.760219 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="9154b01b-b6c6-450a-a022-2c0c7f6ccf9b" containerName="mariadb-database-create" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.760461 4913 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9154b01b-b6c6-450a-a022-2c0c7f6ccf9b" containerName="mariadb-database-create" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.761199 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.763379 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.776194 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1f2e-account-create-t5r92"] Oct 01 13:30:20 crc kubenswrapper[4913]: I1001 13:30:20.900456 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc7jw\" (UniqueName: \"kubernetes.io/projected/72b53680-fe1a-44b3-8b5b-e721be53113c-kube-api-access-nc7jw\") pod \"manila-1f2e-account-create-t5r92\" (UID: \"72b53680-fe1a-44b3-8b5b-e721be53113c\") " pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.003040 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7jw\" (UniqueName: \"kubernetes.io/projected/72b53680-fe1a-44b3-8b5b-e721be53113c-kube-api-access-nc7jw\") pod \"manila-1f2e-account-create-t5r92\" (UID: \"72b53680-fe1a-44b3-8b5b-e721be53113c\") " pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.048474 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc7jw\" (UniqueName: \"kubernetes.io/projected/72b53680-fe1a-44b3-8b5b-e721be53113c-kube-api-access-nc7jw\") pod \"manila-1f2e-account-create-t5r92\" (UID: \"72b53680-fe1a-44b3-8b5b-e721be53113c\") " pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.060730 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.060790 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.092720 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.104085 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.105089 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.219361 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.219820 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.220645 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.221220 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.750943 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1f2e-account-create-t5r92"] Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.822289 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1f2e-account-create-t5r92" event={"ID":"72b53680-fe1a-44b3-8b5b-e721be53113c","Type":"ContainerStarted","Data":"e10ba38f781e9c89848a29fc03ef3ee4fcff69961f66b5bf64d6483f913e7612"} Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.823332 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.823360 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.823374 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:30:21 crc kubenswrapper[4913]: I1001 13:30:21.823386 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:22 crc kubenswrapper[4913]: I1001 13:30:22.832175 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1f2e-account-create-t5r92" event={"ID":"72b53680-fe1a-44b3-8b5b-e721be53113c","Type":"ContainerStarted","Data":"a01bca6bfdd7d77fcf5688e9f5128d9e7256252038e4637282436651c4d738ef"} Oct 01 13:30:22 crc kubenswrapper[4913]: I1001 13:30:22.855314 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-1f2e-account-create-t5r92" podStartSLOduration=2.855288663 podStartE2EDuration="2.855288663s" podCreationTimestamp="2025-10-01 13:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:30:22.848641167 +0000 UTC m=+3154.752116755" watchObservedRunningTime="2025-10-01 13:30:22.855288663 +0000 UTC m=+3154.758764241" Oct 01 13:30:23 crc kubenswrapper[4913]: I1001 13:30:23.806467 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:30:23 crc kubenswrapper[4913]: E1001 13:30:23.807220 4913 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:30:23 crc kubenswrapper[4913]: I1001 13:30:23.841977 4913 generic.go:334] "Generic (PLEG): container finished" podID="72b53680-fe1a-44b3-8b5b-e721be53113c" containerID="a01bca6bfdd7d77fcf5688e9f5128d9e7256252038e4637282436651c4d738ef" exitCode=0 Oct 01 13:30:23 crc kubenswrapper[4913]: I1001 13:30:23.842082 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1f2e-account-create-t5r92" event={"ID":"72b53680-fe1a-44b3-8b5b-e721be53113c","Type":"ContainerDied","Data":"a01bca6bfdd7d77fcf5688e9f5128d9e7256252038e4637282436651c4d738ef"} Oct 01 13:30:23 crc kubenswrapper[4913]: I1001 13:30:23.842242 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:30:23 crc kubenswrapper[4913]: I1001 13:30:23.842303 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.215491 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.291569 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc7jw\" (UniqueName: \"kubernetes.io/projected/72b53680-fe1a-44b3-8b5b-e721be53113c-kube-api-access-nc7jw\") pod \"72b53680-fe1a-44b3-8b5b-e721be53113c\" (UID: \"72b53680-fe1a-44b3-8b5b-e721be53113c\") " Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.297506 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b53680-fe1a-44b3-8b5b-e721be53113c-kube-api-access-nc7jw" (OuterVolumeSpecName: "kube-api-access-nc7jw") pod "72b53680-fe1a-44b3-8b5b-e721be53113c" (UID: "72b53680-fe1a-44b3-8b5b-e721be53113c"). InnerVolumeSpecName "kube-api-access-nc7jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.393871 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc7jw\" (UniqueName: \"kubernetes.io/projected/72b53680-fe1a-44b3-8b5b-e721be53113c-kube-api-access-nc7jw\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.860381 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1f2e-account-create-t5r92" event={"ID":"72b53680-fe1a-44b3-8b5b-e721be53113c","Type":"ContainerDied","Data":"e10ba38f781e9c89848a29fc03ef3ee4fcff69961f66b5bf64d6483f913e7612"} Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.860427 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e10ba38f781e9c89848a29fc03ef3ee4fcff69961f66b5bf64d6483f913e7612" Oct 01 13:30:25 crc kubenswrapper[4913]: I1001 13:30:25.860429 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-1f2e-account-create-t5r92" Oct 01 13:30:26 crc kubenswrapper[4913]: I1001 13:30:26.922077 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:26 crc kubenswrapper[4913]: I1001 13:30:26.922522 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:30:26 crc kubenswrapper[4913]: I1001 13:30:26.933947 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:30:26 crc kubenswrapper[4913]: I1001 13:30:26.934140 4913 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:30:26 crc kubenswrapper[4913]: I1001 13:30:26.945532 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:30:26 crc kubenswrapper[4913]: I1001 13:30:26.954928 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.003996 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-grb9h"] Oct 01 13:30:31 crc kubenswrapper[4913]: E1001 13:30:31.006298 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b53680-fe1a-44b3-8b5b-e721be53113c" containerName="mariadb-account-create" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.006423 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b53680-fe1a-44b3-8b5b-e721be53113c" containerName="mariadb-account-create" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.006753 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b53680-fe1a-44b3-8b5b-e721be53113c" containerName="mariadb-account-create" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.007706 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.010411 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-vmbpp" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.010634 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.014459 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-grb9h"] Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.099336 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4grb6\" (UniqueName: \"kubernetes.io/projected/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-kube-api-access-4grb6\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.099416 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-job-config-data\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.099490 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-config-data\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.099560 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-combined-ca-bundle\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.200399 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4grb6\" (UniqueName: \"kubernetes.io/projected/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-kube-api-access-4grb6\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.200496 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-job-config-data\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.200534 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-config-data\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.200601 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-combined-ca-bundle\") pod \"manila-db-sync-grb9h\" (UID: 
\"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.206249 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-job-config-data\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.207863 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-combined-ca-bundle\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.207970 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-config-data\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.229115 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4grb6\" (UniqueName: \"kubernetes.io/projected/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-kube-api-access-4grb6\") pod \"manila-db-sync-grb9h\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.338739 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-grb9h" Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.851519 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-grb9h"] Oct 01 13:30:31 crc kubenswrapper[4913]: I1001 13:30:31.926354 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-grb9h" event={"ID":"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9","Type":"ContainerStarted","Data":"2afd3514dc63bb09860eb30f6a897ccc9f84bbae121839b8016ca9e95616ece3"} Oct 01 13:30:38 crc kubenswrapper[4913]: I1001 13:30:38.813381 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:30:38 crc kubenswrapper[4913]: E1001 13:30:38.814709 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:30:45 crc kubenswrapper[4913]: I1001 13:30:45.053308 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-grb9h" event={"ID":"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9","Type":"ContainerStarted","Data":"f3f8286a7f2f5f6971d4572620219a09aaa86eb4087cbd6fe7d687cf0ae32f90"} Oct 01 13:30:45 crc kubenswrapper[4913]: I1001 13:30:45.082710 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-grb9h" podStartSLOduration=2.616972708 podStartE2EDuration="15.082686228s" podCreationTimestamp="2025-10-01 13:30:30 +0000 UTC" firstStartedPulling="2025-10-01 13:30:31.859805582 +0000 UTC m=+3163.763281160" 
lastFinishedPulling="2025-10-01 13:30:44.325519082 +0000 UTC m=+3176.228994680" observedRunningTime="2025-10-01 13:30:45.072503352 +0000 UTC m=+3176.975978950" watchObservedRunningTime="2025-10-01 13:30:45.082686228 +0000 UTC m=+3176.986161796" Oct 01 13:30:50 crc kubenswrapper[4913]: I1001 13:30:50.807019 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:30:51 crc kubenswrapper[4913]: I1001 13:30:51.108424 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"c6822e12a17aae1690bae98ddc4a0974be6566860672dde11dc55972de0232d9"} Oct 01 13:31:58 crc kubenswrapper[4913]: I1001 13:31:58.759018 4913 generic.go:334] "Generic (PLEG): container finished" podID="4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" containerID="f3f8286a7f2f5f6971d4572620219a09aaa86eb4087cbd6fe7d687cf0ae32f90" exitCode=0 Oct 01 13:31:58 crc kubenswrapper[4913]: I1001 13:31:58.759107 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-grb9h" event={"ID":"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9","Type":"ContainerDied","Data":"f3f8286a7f2f5f6971d4572620219a09aaa86eb4087cbd6fe7d687cf0ae32f90"} Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.186748 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-grb9h" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.352794 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-config-data\") pod \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.353132 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4grb6\" (UniqueName: \"kubernetes.io/projected/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-kube-api-access-4grb6\") pod \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.353171 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-job-config-data\") pod \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.353319 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-combined-ca-bundle\") pod \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\" (UID: \"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9\") " Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.359638 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" (UID: "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.359670 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-kube-api-access-4grb6" (OuterVolumeSpecName: "kube-api-access-4grb6") pod "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" (UID: "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9"). InnerVolumeSpecName "kube-api-access-4grb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.368970 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-config-data" (OuterVolumeSpecName: "config-data") pod "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" (UID: "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.379944 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" (UID: "4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.456168 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.456214 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.456233 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4grb6\" (UniqueName: \"kubernetes.io/projected/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-kube-api-access-4grb6\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.456252 4913 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.775788 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-grb9h" event={"ID":"4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9","Type":"ContainerDied","Data":"2afd3514dc63bb09860eb30f6a897ccc9f84bbae121839b8016ca9e95616ece3"} Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.775834 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2afd3514dc63bb09860eb30f6a897ccc9f84bbae121839b8016ca9e95616ece3" Oct 01 13:32:00 crc kubenswrapper[4913]: I1001 13:32:00.775842 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-grb9h" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.062480 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: E1001 13:32:01.062856 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" containerName="manila-db-sync" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.062871 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" containerName="manila-db-sync" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.063047 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" containerName="manila-db-sync" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.064006 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.067218 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-vmbpp" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.067325 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.067396 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.070112 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.091537 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.093199 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.096684 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.112896 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.137229 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.169924 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.169972 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170001 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhth\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-kube-api-access-5xhth\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170101 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170195 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71c0701a-649f-46b9-9245-90d837dc9a28-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170224 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-ceph\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170357 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-scripts\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170412 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cb5b\" (UniqueName: 
\"kubernetes.io/projected/71c0701a-649f-46b9-9245-90d837dc9a28-kube-api-access-6cb5b\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170450 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-scripts\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170507 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170589 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170624 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170679 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.170716 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.223383 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc75556d9-559t4"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.225315 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.239389 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc75556d9-559t4"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.273361 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274249 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274340 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274365 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274399 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274415 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274441 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhth\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-kube-api-access-5xhth\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274462 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274490 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71c0701a-649f-46b9-9245-90d837dc9a28-etc-machine-id\") pod \"manila-scheduler-0\" (UID: 
\"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274505 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-ceph\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274551 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-scripts\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274575 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cb5b\" (UniqueName: \"kubernetes.io/projected/71c0701a-649f-46b9-9245-90d837dc9a28-kube-api-access-6cb5b\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274595 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-scripts\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274632 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.274718 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.276414 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.277694 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71c0701a-649f-46b9-9245-90d837dc9a28-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.281790 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-scripts\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.281904 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.283137 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-scripts\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.288532 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.289037 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.289250 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.289534 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.289644 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-ceph\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.289996 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.305794 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cb5b\" (UniqueName: \"kubernetes.io/projected/71c0701a-649f-46b9-9245-90d837dc9a28-kube-api-access-6cb5b\") pod \"manila-scheduler-0\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.314970 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhth\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-kube-api-access-5xhth\") pod \"manila-share-share1-0\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.366968 4913 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.368581 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.371882 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.376410 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62svz\" (UniqueName: \"kubernetes.io/projected/0908cda5-711c-4449-90fb-2fd7f524b0db-kube-api-access-62svz\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.376478 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-config\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.376544 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.376691 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.376947 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-dns-svc\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.377003 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.379108 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.388369 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.417710 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480533 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d56123e-4583-4eae-9f6b-76069c2255f0-logs\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480588 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-dns-svc\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480614 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data-custom\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480636 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480676 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62svz\" (UniqueName: \"kubernetes.io/projected/0908cda5-711c-4449-90fb-2fd7f524b0db-kube-api-access-62svz\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480698 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8m5q\" (UniqueName: \"kubernetes.io/projected/5d56123e-4583-4eae-9f6b-76069c2255f0-kube-api-access-m8m5q\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480715 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480729 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480750 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-config\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480766 4913 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-scripts\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480792 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480811 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d56123e-4583-4eae-9f6b-76069c2255f0-etc-machine-id\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.480850 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.481700 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.481998 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-config\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.482238 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.482639 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-dns-svc\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.483058 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0908cda5-711c-4449-90fb-2fd7f524b0db-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.505029 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62svz\" (UniqueName: 
\"kubernetes.io/projected/0908cda5-711c-4449-90fb-2fd7f524b0db-kube-api-access-62svz\") pod \"dnsmasq-dns-7fc75556d9-559t4\" (UID: \"0908cda5-711c-4449-90fb-2fd7f524b0db\") " pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.544753 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586507 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d56123e-4583-4eae-9f6b-76069c2255f0-etc-machine-id\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586662 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d56123e-4583-4eae-9f6b-76069c2255f0-logs\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586713 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data-custom\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586810 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8m5q\" (UniqueName: \"kubernetes.io/projected/5d56123e-4583-4eae-9f6b-76069c2255f0-kube-api-access-m8m5q\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586846 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586867 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.586903 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-scripts\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.588938 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d56123e-4583-4eae-9f6b-76069c2255f0-etc-machine-id\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.589354 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d56123e-4583-4eae-9f6b-76069c2255f0-logs\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " 
pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.591294 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-scripts\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.593165 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data-custom\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.596783 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.596981 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.616893 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8m5q\" (UniqueName: \"kubernetes.io/projected/5d56123e-4583-4eae-9f6b-76069c2255f0-kube-api-access-m8m5q\") pod \"manila-api-0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " pod="openstack/manila-api-0" Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.753904 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.851420 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"71c0701a-649f-46b9-9245-90d837dc9a28","Type":"ContainerStarted","Data":"5e9c7e84fb1e14d36c9e5b6d064eaa50fc1d0b1a480d3196425811769eb5d9a9"} Oct 01 13:32:01 crc kubenswrapper[4913]: I1001 13:32:01.864737 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 01 13:32:02 crc kubenswrapper[4913]: W1001 13:32:01.927881 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc45efe6_4d81_459a_9338_606b52f920e1.slice/crio-3aff7431168fbed411f41f97d2c41b14979c314736f2e8a3836a74a57e9f63f2 WatchSource:0}: Error finding container 3aff7431168fbed411f41f97d2c41b14979c314736f2e8a3836a74a57e9f63f2: Status 404 returned error can't find the container with id 3aff7431168fbed411f41f97d2c41b14979c314736f2e8a3836a74a57e9f63f2 Oct 01 13:32:02 crc kubenswrapper[4913]: I1001 13:32:01.928252 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:02 crc kubenswrapper[4913]: I1001 13:32:02.122248 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc75556d9-559t4"] Oct 01 13:32:02 crc kubenswrapper[4913]: W1001 13:32:02.128815 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0908cda5_711c_4449_90fb_2fd7f524b0db.slice/crio-053f8235eba0def04f7b939d622f8a883286e6df7c8119b03e24dc68dd7be06f WatchSource:0}: Error finding container 053f8235eba0def04f7b939d622f8a883286e6df7c8119b03e24dc68dd7be06f: Status 404 returned error can't find the container with id 053f8235eba0def04f7b939d622f8a883286e6df7c8119b03e24dc68dd7be06f Oct 01 13:32:02 crc kubenswrapper[4913]: I1001 13:32:02.876048 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dc45efe6-4d81-459a-9338-606b52f920e1","Type":"ContainerStarted","Data":"3aff7431168fbed411f41f97d2c41b14979c314736f2e8a3836a74a57e9f63f2"} Oct 01 13:32:02 crc kubenswrapper[4913]: I1001 13:32:02.879260 4913 generic.go:334] "Generic (PLEG): container finished" podID="0908cda5-711c-4449-90fb-2fd7f524b0db" containerID="09ca662f2940f07074c83231770acb11f225ac65677a8686ce6efc2adaba8cad" exitCode=0 Oct 01 13:32:02 crc kubenswrapper[4913]: I1001 13:32:02.879330 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" event={"ID":"0908cda5-711c-4449-90fb-2fd7f524b0db","Type":"ContainerDied","Data":"09ca662f2940f07074c83231770acb11f225ac65677a8686ce6efc2adaba8cad"} Oct 01 13:32:02 crc kubenswrapper[4913]: I1001 13:32:02.879362 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" event={"ID":"0908cda5-711c-4449-90fb-2fd7f524b0db","Type":"ContainerStarted","Data":"053f8235eba0def04f7b939d622f8a883286e6df7c8119b03e24dc68dd7be06f"} Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.236651 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:03 crc kubenswrapper[4913]: W1001 13:32:03.257252 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d56123e_4583_4eae_9f6b_76069c2255f0.slice/crio-4e389675891a3beae1c12ea263d91d472f62fddd59ca28be69949dd978ff0f5d WatchSource:0}: Error finding container 4e389675891a3beae1c12ea263d91d472f62fddd59ca28be69949dd978ff0f5d: Status 404 returned error can't find the container with id 4e389675891a3beae1c12ea263d91d472f62fddd59ca28be69949dd978ff0f5d Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.895530 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"5d56123e-4583-4eae-9f6b-76069c2255f0","Type":"ContainerStarted","Data":"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"} Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.895944 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5d56123e-4583-4eae-9f6b-76069c2255f0","Type":"ContainerStarted","Data":"4e389675891a3beae1c12ea263d91d472f62fddd59ca28be69949dd978ff0f5d"} Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.904603 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"71c0701a-649f-46b9-9245-90d837dc9a28","Type":"ContainerStarted","Data":"e8050bee28a591da5b0e35e5768ed6965cb04b322f14d7f47c34af9770801e1d"} Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.909229 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" event={"ID":"0908cda5-711c-4449-90fb-2fd7f524b0db","Type":"ContainerStarted","Data":"5c0828c634d3d77a9fa106dcb18490545fa8d8cd472ae4dd34538b6a6a0f5c63"} Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.911218 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.944659 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:03 crc kubenswrapper[4913]: I1001 13:32:03.950695 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc75556d9-559t4" podStartSLOduration=2.950673655 podStartE2EDuration="2.950673655s" podCreationTimestamp="2025-10-01 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:32:03.931261598 +0000 UTC m=+3255.834737196" watchObservedRunningTime="2025-10-01 13:32:03.950673655 +0000 UTC m=+3255.854149223" Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.926823 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5d56123e-4583-4eae-9f6b-76069c2255f0","Type":"ContainerStarted","Data":"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"} Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.927177 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.927047 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api" containerID="cri-o://8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7" gracePeriod=30 Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.926958 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api-log" containerID="cri-o://711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9" gracePeriod=30 Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.929627 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"71c0701a-649f-46b9-9245-90d837dc9a28","Type":"ContainerStarted","Data":"1bb4793c57b992463244b984d875b2a9f9e29ca45e32eda99dd8a70db1bf809c"} Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.952178 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/manila-api-0" podStartSLOduration=3.952035999 podStartE2EDuration="3.952035999s" podCreationTimestamp="2025-10-01 13:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:32:04.944197772 +0000 UTC m=+3256.847673370" watchObservedRunningTime="2025-10-01 13:32:04.952035999 +0000 UTC m=+3256.855511577" Oct 01 13:32:04 crc kubenswrapper[4913]: I1001 13:32:04.976654 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.942665113 podStartE2EDuration="3.976632902s" podCreationTimestamp="2025-10-01 13:32:01 +0000 UTC" firstStartedPulling="2025-10-01 13:32:01.755392028 +0000 UTC m=+3253.658867606" lastFinishedPulling="2025-10-01 13:32:02.789359817 +0000 UTC m=+3254.692835395" observedRunningTime="2025-10-01 13:32:04.966705972 +0000 UTC m=+3256.870181570" watchObservedRunningTime="2025-10-01 13:32:04.976632902 +0000 UTC m=+3256.880108480" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.650085 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809394 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d56123e-4583-4eae-9f6b-76069c2255f0-etc-machine-id\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809545 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d56123e-4583-4eae-9f6b-76069c2255f0-logs\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809581 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-combined-ca-bundle\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809628 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data-custom\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809707 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-scripts\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809816 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8m5q\" (UniqueName: \"kubernetes.io/projected/5d56123e-4583-4eae-9f6b-76069c2255f0-kube-api-access-m8m5q\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.809929 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data\") pod \"5d56123e-4583-4eae-9f6b-76069c2255f0\" (UID: \"5d56123e-4583-4eae-9f6b-76069c2255f0\") " Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.811869 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d56123e-4583-4eae-9f6b-76069c2255f0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.813498 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d56123e-4583-4eae-9f6b-76069c2255f0-logs" (OuterVolumeSpecName: "logs") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.849524 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d56123e-4583-4eae-9f6b-76069c2255f0-kube-api-access-m8m5q" (OuterVolumeSpecName: "kube-api-access-m8m5q") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "kube-api-access-m8m5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.850143 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-scripts" (OuterVolumeSpecName: "scripts") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.854028 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.873776 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.894988 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data" (OuterVolumeSpecName: "config-data") pod "5d56123e-4583-4eae-9f6b-76069c2255f0" (UID: "5d56123e-4583-4eae-9f6b-76069c2255f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.916985 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.917029 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8m5q\" (UniqueName: \"kubernetes.io/projected/5d56123e-4583-4eae-9f6b-76069c2255f0-kube-api-access-m8m5q\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.917044 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.917057 4913 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d56123e-4583-4eae-9f6b-76069c2255f0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.917070 4913 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d56123e-4583-4eae-9f6b-76069c2255f0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.917080 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.917092 4913 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d56123e-4583-4eae-9f6b-76069c2255f0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.946619 4913 generic.go:334] "Generic (PLEG): container finished" podID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerID="8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7" exitCode=0 Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.946661 4913 generic.go:334] "Generic (PLEG): container finished" podID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerID="711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9" exitCode=143 Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.947727 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.955577 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5d56123e-4583-4eae-9f6b-76069c2255f0","Type":"ContainerDied","Data":"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"} Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.955639 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5d56123e-4583-4eae-9f6b-76069c2255f0","Type":"ContainerDied","Data":"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"} Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.955657 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"5d56123e-4583-4eae-9f6b-76069c2255f0","Type":"ContainerDied","Data":"4e389675891a3beae1c12ea263d91d472f62fddd59ca28be69949dd978ff0f5d"} Oct 01 13:32:05 crc kubenswrapper[4913]: I1001 13:32:05.955681 4913 scope.go:117] "RemoveContainer" containerID="8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7" Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.032317 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.032369 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.048078 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 01 13:32:06 crc kubenswrapper[4913]: E1001 13:32:06.048581 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api" Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.048603 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api" Oct 01 13:32:06 crc kubenswrapper[4913]: E1001 13:32:06.048618 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api-log" Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.048623 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api-log" Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.048816 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api-log" Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.048840 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" containerName="manila-api" Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.049808 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.051923 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.052645 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.054067 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.055338 4913 scope.go:117] "RemoveContainer" containerID="711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.061583 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.086335 4913 scope.go:117] "RemoveContainer" containerID="8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"
Oct 01 13:32:06 crc kubenswrapper[4913]: E1001 13:32:06.086751 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7\": container with ID starting with 8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7 not found: ID does not exist" containerID="8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.086789 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"} err="failed to get container status \"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7\": rpc error: code = NotFound desc = could not find container \"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7\": container with ID starting with 8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7 not found: ID does not exist"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.086822 4913 scope.go:117] "RemoveContainer" containerID="711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"
Oct 01 13:32:06 crc kubenswrapper[4913]: E1001 13:32:06.087743 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9\": container with ID starting with 711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9 not found: ID does not exist" containerID="711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.087769 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"} err="failed to get container status \"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9\": rpc error: code = NotFound desc = could not find container \"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9\": container with ID starting with 711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9 not found: ID does not exist"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.087784 4913 scope.go:117] "RemoveContainer" containerID="8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.088457 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7"} err="failed to get container status \"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7\": rpc error: code = NotFound desc = could not find container \"8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7\": container with ID starting with 8c30e95270592ac339e879b612759eb5a45d6d1117370ec1cc257242261707b7 not found: ID does not exist"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.088479 4913 scope.go:117] "RemoveContainer" containerID="711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.088890 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9"} err="failed to get container status \"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9\": rpc error: code = NotFound desc = could not find container \"711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9\": container with ID starting with 711b0a212996d387403ace191221dcc41ff24fe1ed28381bf6d77f5a9225b5e9 not found: ID does not exist"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141185 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrzh\" (UniqueName: \"kubernetes.io/projected/cb953036-cada-47b8-8f60-6a7df072c7e2-kube-api-access-shrzh\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141282 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-scripts\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141320 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-config-data-custom\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141364 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb953036-cada-47b8-8f60-6a7df072c7e2-logs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141385 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb953036-cada-47b8-8f60-6a7df072c7e2-etc-machine-id\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141405 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141423 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-public-tls-certs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141454 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-config-data\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.141470 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-internal-tls-certs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242647 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-scripts\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242696 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-config-data-custom\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242729 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb953036-cada-47b8-8f60-6a7df072c7e2-logs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242750 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb953036-cada-47b8-8f60-6a7df072c7e2-etc-machine-id\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242774 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242797 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-public-tls-certs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242823 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-config-data\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242837 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-internal-tls-certs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.242913 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrzh\" (UniqueName: \"kubernetes.io/projected/cb953036-cada-47b8-8f60-6a7df072c7e2-kube-api-access-shrzh\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.243503 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb953036-cada-47b8-8f60-6a7df072c7e2-logs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.243540 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb953036-cada-47b8-8f60-6a7df072c7e2-etc-machine-id\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.247183 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-scripts\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.247845 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-config-data\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.248765 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-public-tls-certs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.249406 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.249446 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-config-data-custom\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.258141 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb953036-cada-47b8-8f60-6a7df072c7e2-internal-tls-certs\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.263072 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrzh\" (UniqueName: \"kubernetes.io/projected/cb953036-cada-47b8-8f60-6a7df072c7e2-kube-api-access-shrzh\") pod \"manila-api-0\" (UID: \"cb953036-cada-47b8-8f60-6a7df072c7e2\") " pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.376755 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.800184 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.800810 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-central-agent" containerID="cri-o://1d25ae0bccec6ac70640a85460308cfb578e713ec810f1442b3d4ddf5bfb4376" gracePeriod=30
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.800972 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="proxy-httpd" containerID="cri-o://8ba8adfe345f66348211f5072e29facd580bdc6a6c18466c7d0bf1ada230a67e" gracePeriod=30
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.801016 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="sg-core" containerID="cri-o://e26ec81d006b97f0751099a25aeb00dacbd36dda58452ba98b53767e3c31f188" gracePeriod=30
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.801060 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-notification-agent" containerID="cri-o://ecd6b515d802897b63c90bd65f0970c65293b8d85345fa6c742603b43a56f55b" gracePeriod=30
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.823249 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d56123e-4583-4eae-9f6b-76069c2255f0" path="/var/lib/kubelet/pods/5d56123e-4583-4eae-9f6b-76069c2255f0/volumes"
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.946424 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.969445 4913 generic.go:334] "Generic (PLEG): container finished" podID="32bdf40f-f949-40cf-8416-3a032c521d59" containerID="e26ec81d006b97f0751099a25aeb00dacbd36dda58452ba98b53767e3c31f188" exitCode=2
Oct 01 13:32:06 crc kubenswrapper[4913]: I1001 13:32:06.969496 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerDied","Data":"e26ec81d006b97f0751099a25aeb00dacbd36dda58452ba98b53767e3c31f188"}
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.980725 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"cb953036-cada-47b8-8f60-6a7df072c7e2","Type":"ContainerStarted","Data":"86148bdff420197c20e7edaf0eb9ce277006e53fa51e028328cc5a22105493dc"}
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.981034 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"cb953036-cada-47b8-8f60-6a7df072c7e2","Type":"ContainerStarted","Data":"d5ca8798649dd90437920b6722467dd38a124b777ab700085b5606095eb4d3e2"}
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.983637 4913 generic.go:334] "Generic (PLEG): container finished" podID="32bdf40f-f949-40cf-8416-3a032c521d59" containerID="8ba8adfe345f66348211f5072e29facd580bdc6a6c18466c7d0bf1ada230a67e" exitCode=0
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.983663 4913 generic.go:334] "Generic (PLEG): container finished" podID="32bdf40f-f949-40cf-8416-3a032c521d59" containerID="ecd6b515d802897b63c90bd65f0970c65293b8d85345fa6c742603b43a56f55b" exitCode=0
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.983673 4913 generic.go:334] "Generic (PLEG): container finished" podID="32bdf40f-f949-40cf-8416-3a032c521d59" containerID="1d25ae0bccec6ac70640a85460308cfb578e713ec810f1442b3d4ddf5bfb4376" exitCode=0
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.983692 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerDied","Data":"8ba8adfe345f66348211f5072e29facd580bdc6a6c18466c7d0bf1ada230a67e"}
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.983710 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerDied","Data":"ecd6b515d802897b63c90bd65f0970c65293b8d85345fa6c742603b43a56f55b"}
Oct 01 13:32:07 crc kubenswrapper[4913]: I1001 13:32:07.983721 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerDied","Data":"1d25ae0bccec6ac70640a85460308cfb578e713ec810f1442b3d4ddf5bfb4376"}
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.628449 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.729995 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-config-data\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730464 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-log-httpd\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730497 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-scripts\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730580 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j9x8\" (UniqueName: \"kubernetes.io/projected/32bdf40f-f949-40cf-8416-3a032c521d59-kube-api-access-8j9x8\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730630 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-ceilometer-tls-certs\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730659 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-sg-core-conf-yaml\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730693 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-combined-ca-bundle\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.730750 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-run-httpd\") pod \"32bdf40f-f949-40cf-8416-3a032c521d59\" (UID: \"32bdf40f-f949-40cf-8416-3a032c521d59\") "
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.732401 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.733236 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.745732 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-scripts" (OuterVolumeSpecName: "scripts") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.745712 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bdf40f-f949-40cf-8416-3a032c521d59-kube-api-access-8j9x8" (OuterVolumeSpecName: "kube-api-access-8j9x8") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "kube-api-access-8j9x8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.787577 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.819378 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.833060 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.833102 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.833116 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j9x8\" (UniqueName: \"kubernetes.io/projected/32bdf40f-f949-40cf-8416-3a032c521d59-kube-api-access-8j9x8\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.833131 4913 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.833143 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.833154 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bdf40f-f949-40cf-8416-3a032c521d59-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.839904 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.874635 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-config-data" (OuterVolumeSpecName: "config-data") pod "32bdf40f-f949-40cf-8416-3a032c521d59" (UID: "32bdf40f-f949-40cf-8416-3a032c521d59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.935193 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:10 crc kubenswrapper[4913]: I1001 13:32:10.935230 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bdf40f-f949-40cf-8416-3a032c521d59-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.010714 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"cb953036-cada-47b8-8f60-6a7df072c7e2","Type":"ContainerStarted","Data":"c7a23a073d22a9dd983a0027a5298e1168c90790048a7cc2b2709827aec6807f"}
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.010836 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.013070 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bdf40f-f949-40cf-8416-3a032c521d59","Type":"ContainerDied","Data":"739279761b1046f87b319836bfdbc90584e977ad485292e668f73b680eac4fb1"}
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.013101 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.013128 4913 scope.go:117] "RemoveContainer" containerID="8ba8adfe345f66348211f5072e29facd580bdc6a6c18466c7d0bf1ada230a67e"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.049239 4913 scope.go:117] "RemoveContainer" containerID="e26ec81d006b97f0751099a25aeb00dacbd36dda58452ba98b53767e3c31f188"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.052169 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.052147153 podStartE2EDuration="5.052147153s" podCreationTimestamp="2025-10-01 13:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:32:11.029497099 +0000 UTC m=+3262.932972697" watchObservedRunningTime="2025-10-01 13:32:11.052147153 +0000 UTC m=+3262.955622731"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.074953 4913 scope.go:117] "RemoveContainer" containerID="ecd6b515d802897b63c90bd65f0970c65293b8d85345fa6c742603b43a56f55b"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.076438 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.095610 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.123008 4913 scope.go:117] "RemoveContainer" containerID="1d25ae0bccec6ac70640a85460308cfb578e713ec810f1442b3d4ddf5bfb4376"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126145 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:11 crc kubenswrapper[4913]: E1001 13:32:11.126526 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-central-agent"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126539 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-central-agent"
Oct 01 13:32:11 crc kubenswrapper[4913]: E1001 13:32:11.126566 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-notification-agent"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126575 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-notification-agent"
Oct 01 13:32:11 crc kubenswrapper[4913]: E1001 13:32:11.126595 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="sg-core"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126601 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="sg-core"
Oct 01 13:32:11 crc kubenswrapper[4913]: E1001 13:32:11.126615 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="proxy-httpd"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126621 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="proxy-httpd"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126792 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-central-agent"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126806 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="proxy-httpd"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126818 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="sg-core"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.126837 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" containerName="ceilometer-notification-agent"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.128489 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.130653 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.130799 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.130954 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.146427 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.240914 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-log-httpd\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.240981 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-scripts\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.241018 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.241035 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.241053 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.241079 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gczjq\" (UniqueName: \"kubernetes.io/projected/8fe5439a-1b24-4212-a088-0d0787144197-kube-api-access-gczjq\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.241094 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-config-data\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.241152 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-run-httpd\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342418 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-log-httpd\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342474 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-scripts\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342513 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342533 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342551 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342673 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gczjq\" (UniqueName: \"kubernetes.io/projected/8fe5439a-1b24-4212-a088-0d0787144197-kube-api-access-gczjq\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342704 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-config-data\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342775 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-run-httpd\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.342929 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-log-httpd\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.343175 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-run-httpd\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.347452 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.348124 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.348601 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.349658 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-scripts\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.356968 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-config-data\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.362916 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gczjq\" (UniqueName: \"kubernetes.io/projected/8fe5439a-1b24-4212-a088-0d0787144197-kube-api-access-gczjq\") pod \"ceilometer-0\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.380790 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.466081 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.546445 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc75556d9-559t4"
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.663339 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84968f68f7-g5z7n"]
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.664232 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerName="dnsmasq-dns" containerID="cri-o://b102f83d21ae82772b304d7ba9cfde3ed5facfc0a16075ca87c13a6ee21f2071" gracePeriod=10
Oct 01 13:32:11 crc kubenswrapper[4913]: I1001 13:32:11.965552 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:11 crc kubenswrapper[4913]: W1001 13:32:11.976760 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fe5439a_1b24_4212_a088_0d0787144197.slice/crio-5d4c9adaa2be763bc8ffdedd5926b8f3c2f8d1dd586f702e24095457a7c86f65 WatchSource:0}: Error finding container 5d4c9adaa2be763bc8ffdedd5926b8f3c2f8d1dd586f702e24095457a7c86f65: Status 404 returned error can't find the container with id 5d4c9adaa2be763bc8ffdedd5926b8f3c2f8d1dd586f702e24095457a7c86f65
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.029829 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerStarted","Data":"5d4c9adaa2be763bc8ffdedd5926b8f3c2f8d1dd586f702e24095457a7c86f65"}
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.031863 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dc45efe6-4d81-459a-9338-606b52f920e1","Type":"ContainerStarted","Data":"bc731a3db1ea633e5408f439a877220a602a96ee846a928a4e0fa8a6c3cca3b0"}
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.031898 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dc45efe6-4d81-459a-9338-606b52f920e1","Type":"ContainerStarted","Data":"532f266607864b4db09ec16acf12e1e5fbc60caaf2c7be918319da9cdde7ed13"}
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.042375 4913 generic.go:334] "Generic (PLEG): container finished" podID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerID="b102f83d21ae82772b304d7ba9cfde3ed5facfc0a16075ca87c13a6ee21f2071" exitCode=0
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.042432 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" event={"ID":"0e9d019d-bd66-4323-843b-e52e0efa0771","Type":"ContainerDied","Data":"b102f83d21ae82772b304d7ba9cfde3ed5facfc0a16075ca87c13a6ee21f2071"}
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.075209 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.435757317 podStartE2EDuration="11.075186812s" podCreationTimestamp="2025-10-01 13:32:01 +0000 UTC" firstStartedPulling="2025-10-01 13:32:01.931166669 +0000 UTC m=+3253.834642247" lastFinishedPulling="2025-10-01 13:32:10.570596164 +0000 UTC m=+3262.474071742" observedRunningTime="2025-10-01 13:32:12.059021883 +0000 UTC m=+3263.962497481" watchObservedRunningTime="2025-10-01 13:32:12.075186812 +0000 UTC m=+3263.978662390"
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.195123 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.384917 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-config\") pod \"0e9d019d-bd66-4323-843b-e52e0efa0771\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") "
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.385492 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgjsp\" (UniqueName: \"kubernetes.io/projected/0e9d019d-bd66-4323-843b-e52e0efa0771-kube-api-access-wgjsp\") pod \"0e9d019d-bd66-4323-843b-e52e0efa0771\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") "
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.385567 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-openstack-edpm-ipam\") pod \"0e9d019d-bd66-4323-843b-e52e0efa0771\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") "
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.385591 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-sb\") pod \"0e9d019d-bd66-4323-843b-e52e0efa0771\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") "
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.385648 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-nb\") pod \"0e9d019d-bd66-4323-843b-e52e0efa0771\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") "
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.385799 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-dns-svc\") pod \"0e9d019d-bd66-4323-843b-e52e0efa0771\" (UID: \"0e9d019d-bd66-4323-843b-e52e0efa0771\") "
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.409207 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9d019d-bd66-4323-843b-e52e0efa0771-kube-api-access-wgjsp" (OuterVolumeSpecName: "kube-api-access-wgjsp") pod "0e9d019d-bd66-4323-843b-e52e0efa0771" (UID: "0e9d019d-bd66-4323-843b-e52e0efa0771"). InnerVolumeSpecName "kube-api-access-wgjsp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.458965 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0e9d019d-bd66-4323-843b-e52e0efa0771" (UID: "0e9d019d-bd66-4323-843b-e52e0efa0771"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.459025 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e9d019d-bd66-4323-843b-e52e0efa0771" (UID: "0e9d019d-bd66-4323-843b-e52e0efa0771"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.463236 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0e9d019d-bd66-4323-843b-e52e0efa0771" (UID: "0e9d019d-bd66-4323-843b-e52e0efa0771"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.463743 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0e9d019d-bd66-4323-843b-e52e0efa0771" (UID: "0e9d019d-bd66-4323-843b-e52e0efa0771"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.466538 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-config" (OuterVolumeSpecName: "config") pod "0e9d019d-bd66-4323-843b-e52e0efa0771" (UID: "0e9d019d-bd66-4323-843b-e52e0efa0771"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.489155 4913 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.489200 4913 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.489212 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgjsp\" (UniqueName: \"kubernetes.io/projected/0e9d019d-bd66-4323-843b-e52e0efa0771-kube-api-access-wgjsp\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.489294 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.489307 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.489319 4913 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e9d019d-bd66-4323-843b-e52e0efa0771-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:12 crc kubenswrapper[4913]: I1001 13:32:12.820001 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bdf40f-f949-40cf-8416-3a032c521d59" path="/var/lib/kubelet/pods/32bdf40f-f949-40cf-8416-3a032c521d59/volumes"
Oct 01 13:32:13 crc kubenswrapper[4913]: I1001 13:32:13.053812 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n" event={"ID":"0e9d019d-bd66-4323-843b-e52e0efa0771","Type":"ContainerDied","Data":"2d402ae383b84b724d1fd212af3666c27da5b9f5c5a6c893e9e3b42ed1bd1c78"}
Oct 01 13:32:13 crc kubenswrapper[4913]: I1001 13:32:13.053851 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84968f68f7-g5z7n"
Oct 01 13:32:13 crc kubenswrapper[4913]: I1001 13:32:13.053870 4913 scope.go:117] "RemoveContainer" containerID="b102f83d21ae82772b304d7ba9cfde3ed5facfc0a16075ca87c13a6ee21f2071"
Oct 01 13:32:13 crc kubenswrapper[4913]: I1001 13:32:13.079428 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84968f68f7-g5z7n"]
Oct 01 13:32:13 crc kubenswrapper[4913]: I1001 13:32:13.087036 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84968f68f7-g5z7n"]
Oct 01 13:32:13 crc kubenswrapper[4913]: I1001 13:32:13.116260 4913 scope.go:117] "RemoveContainer" containerID="b59f85f7259eaf2c83e87fd08c188e17d8465b07c926bb259ebb8e94cd972429"
Oct 01 13:32:14 crc kubenswrapper[4913]: I1001 13:32:14.064632 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerStarted","Data":"149dc5c3d838d43d2dd65373e5769478f0121e6be648abb124b5110bf5c6c9ca"}
Oct 01 13:32:14 crc kubenswrapper[4913]: I1001 13:32:14.819828 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" path="/var/lib/kubelet/pods/0e9d019d-bd66-4323-843b-e52e0efa0771/volumes"
Oct 01 13:32:17 crc kubenswrapper[4913]: I1001 13:32:17.679398 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:32:20 crc kubenswrapper[4913]: I1001 13:32:20.139956 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerStarted","Data":"37caa8b073817a2940d2cab5d32115df22a476a6d20df351c0604d9d36ba7647"}
Oct 01 13:32:21 crc kubenswrapper[4913]: I1001 13:32:21.160166 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerStarted","Data":"5bfe22913f03a481a1e8b313b8b4cfb4c9ff796a0c30348fa46173f34068d8ba"}
Oct 01 13:32:21 crc kubenswrapper[4913]: I1001 13:32:21.418974 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Oct 01 13:32:23 crc kubenswrapper[4913]: I1001 13:32:23.517865 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Oct 01 13:32:23 crc kubenswrapper[4913]: I1001 13:32:23.530045 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Oct 01 13:32:23 crc kubenswrapper[4913]: I1001 13:32:23.582633 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Oct 01 13:32:23 crc kubenswrapper[4913]: I1001 13:32:23.601859 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191154 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerStarted","Data":"1fba055e1c7e2efd2496bdf0c8f7d258dc15d14aa6db184b4da26756db5f25ab"}
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191257 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="manila-share" containerID="cri-o://532f266607864b4db09ec16acf12e1e5fbc60caaf2c7be918319da9cdde7ed13" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191557 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-central-agent" containerID="cri-o://149dc5c3d838d43d2dd65373e5769478f0121e6be648abb124b5110bf5c6c9ca" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191607 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="proxy-httpd" containerID="cri-o://1fba055e1c7e2efd2496bdf0c8f7d258dc15d14aa6db184b4da26756db5f25ab" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191632 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="manila-scheduler" containerID="cri-o://e8050bee28a591da5b0e35e5768ed6965cb04b322f14d7f47c34af9770801e1d" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191667 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="sg-core" containerID="cri-o://5bfe22913f03a481a1e8b313b8b4cfb4c9ff796a0c30348fa46173f34068d8ba" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191676 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="probe" containerID="cri-o://1bb4793c57b992463244b984d875b2a9f9e29ca45e32eda99dd8a70db1bf809c" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191324 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="probe" containerID="cri-o://bc731a3db1ea633e5408f439a877220a602a96ee846a928a4e0fa8a6c3cca3b0" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.191728 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-notification-agent" containerID="cri-o://37caa8b073817a2940d2cab5d32115df22a476a6d20df351c0604d9d36ba7647" gracePeriod=30
Oct 01 13:32:24 crc kubenswrapper[4913]: I1001 13:32:24.260768 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.652494204 podStartE2EDuration="13.260742516s" podCreationTimestamp="2025-10-01 13:32:11 +0000 UTC" firstStartedPulling="2025-10-01 13:32:11.982160351 +0000 UTC m=+3263.885635929" lastFinishedPulling="2025-10-01 13:32:23.590408663 +0000 UTC m=+3275.493884241" observedRunningTime="2025-10-01 13:32:24.238996039 +0000 UTC m=+3276.142471647" watchObservedRunningTime="2025-10-01 13:32:24.260742516 +0000 UTC m=+3276.164218094"
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.201147 4913 generic.go:334] "Generic (PLEG): container finished" podID="71c0701a-649f-46b9-9245-90d837dc9a28" containerID="1bb4793c57b992463244b984d875b2a9f9e29ca45e32eda99dd8a70db1bf809c" exitCode=0
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.201175 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"71c0701a-649f-46b9-9245-90d837dc9a28","Type":"ContainerDied","Data":"1bb4793c57b992463244b984d875b2a9f9e29ca45e32eda99dd8a70db1bf809c"}
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.204445 4913 generic.go:334] "Generic (PLEG): container finished" podID="8fe5439a-1b24-4212-a088-0d0787144197" containerID="5bfe22913f03a481a1e8b313b8b4cfb4c9ff796a0c30348fa46173f34068d8ba" exitCode=2
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.204463 4913 generic.go:334] "Generic (PLEG): container finished" podID="8fe5439a-1b24-4212-a088-0d0787144197" containerID="37caa8b073817a2940d2cab5d32115df22a476a6d20df351c0604d9d36ba7647" exitCode=0
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.204516 4913 generic.go:334] "Generic (PLEG): container finished" podID="8fe5439a-1b24-4212-a088-0d0787144197" containerID="149dc5c3d838d43d2dd65373e5769478f0121e6be648abb124b5110bf5c6c9ca" exitCode=0
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.204518 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerDied","Data":"5bfe22913f03a481a1e8b313b8b4cfb4c9ff796a0c30348fa46173f34068d8ba"}
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.204541 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerDied","Data":"37caa8b073817a2940d2cab5d32115df22a476a6d20df351c0604d9d36ba7647"}
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.204551 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerDied","Data":"149dc5c3d838d43d2dd65373e5769478f0121e6be648abb124b5110bf5c6c9ca"}
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.206456 4913 generic.go:334] "Generic (PLEG): container finished" podID="dc45efe6-4d81-459a-9338-606b52f920e1" containerID="bc731a3db1ea633e5408f439a877220a602a96ee846a928a4e0fa8a6c3cca3b0" exitCode=0
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.206474 4913 generic.go:334] "Generic (PLEG): container finished" podID="dc45efe6-4d81-459a-9338-606b52f920e1" containerID="532f266607864b4db09ec16acf12e1e5fbc60caaf2c7be918319da9cdde7ed13" exitCode=1
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.206493 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dc45efe6-4d81-459a-9338-606b52f920e1","Type":"ContainerDied","Data":"bc731a3db1ea633e5408f439a877220a602a96ee846a928a4e0fa8a6c3cca3b0"}
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.206539 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dc45efe6-4d81-459a-9338-606b52f920e1","Type":"ContainerDied","Data":"532f266607864b4db09ec16acf12e1e5fbc60caaf2c7be918319da9cdde7ed13"}
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.782928 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.957991 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data-custom\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.958180 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-combined-ca-bundle\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.958248 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-etc-machine-id\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.958411 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.958447 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-scripts\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.958472 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-ceph\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.959162 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.958489 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-var-lib-manila\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.959284 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.959316 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xhth\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-kube-api-access-5xhth\") pod \"dc45efe6-4d81-459a-9338-606b52f920e1\" (UID: \"dc45efe6-4d81-459a-9338-606b52f920e1\") "
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.959891 4913 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-var-lib-manila\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.959912 4913 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45efe6-4d81-459a-9338-606b52f920e1-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.964056 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.965648 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-ceph" (OuterVolumeSpecName: "ceph") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.971443 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-scripts" (OuterVolumeSpecName: "scripts") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:25 crc kubenswrapper[4913]: I1001 13:32:25.971631 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-kube-api-access-5xhth" (OuterVolumeSpecName: "kube-api-access-5xhth") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "kube-api-access-5xhth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.026260 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.063259 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.063315 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-ceph\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.063328 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xhth\" (UniqueName: \"kubernetes.io/projected/dc45efe6-4d81-459a-9338-606b52f920e1-kube-api-access-5xhth\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.063344 4913 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.063356 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.067383 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data" (OuterVolumeSpecName: "config-data") pod "dc45efe6-4d81-459a-9338-606b52f920e1" (UID: "dc45efe6-4d81-459a-9338-606b52f920e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.165182 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45efe6-4d81-459a-9338-606b52f920e1-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.218568 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dc45efe6-4d81-459a-9338-606b52f920e1","Type":"ContainerDied","Data":"3aff7431168fbed411f41f97d2c41b14979c314736f2e8a3836a74a57e9f63f2"}
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.218617 4913 scope.go:117] "RemoveContainer" containerID="bc731a3db1ea633e5408f439a877220a602a96ee846a928a4e0fa8a6c3cca3b0"
Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.218689 4913 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.247152 4913 scope.go:117] "RemoveContainer" containerID="532f266607864b4db09ec16acf12e1e5fbc60caaf2c7be918319da9cdde7ed13" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.260715 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.279749 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.298564 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:26 crc kubenswrapper[4913]: E1001 13:32:26.299380 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="manila-share" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299398 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="manila-share" Oct 01 13:32:26 crc kubenswrapper[4913]: E1001 13:32:26.299413 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerName="init" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299421 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerName="init" Oct 01 13:32:26 crc kubenswrapper[4913]: E1001 13:32:26.299436 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="probe" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299443 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="probe" Oct 01 13:32:26 crc kubenswrapper[4913]: E1001 13:32:26.299475 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerName="dnsmasq-dns" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299482 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerName="dnsmasq-dns" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299659 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9d019d-bd66-4323-843b-e52e0efa0771" containerName="dnsmasq-dns" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299681 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="manila-share" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.299705 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" containerName="probe" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.300716 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.302639 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.309672 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.370795 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa45eb41-34ff-42cb-97f8-71004a7e500f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.370853 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl6tf\" (UniqueName: \"kubernetes.io/projected/fa45eb41-34ff-42cb-97f8-71004a7e500f-kube-api-access-dl6tf\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.370970 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fa45eb41-34ff-42cb-97f8-71004a7e500f-ceph\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.371005 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.371135 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-config-data\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.371196 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/fa45eb41-34ff-42cb-97f8-71004a7e500f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.371259 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.371597 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-scripts\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc 
kubenswrapper[4913]: I1001 13:32:26.472710 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa45eb41-34ff-42cb-97f8-71004a7e500f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473029 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl6tf\" (UniqueName: \"kubernetes.io/projected/fa45eb41-34ff-42cb-97f8-71004a7e500f-kube-api-access-dl6tf\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473150 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fa45eb41-34ff-42cb-97f8-71004a7e500f-ceph\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473259 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.472876 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa45eb41-34ff-42cb-97f8-71004a7e500f-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473502 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-config-data\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473644 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/fa45eb41-34ff-42cb-97f8-71004a7e500f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473761 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.473771 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/fa45eb41-34ff-42cb-97f8-71004a7e500f-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.474029 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-scripts\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.477689 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-scripts\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.477848 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.478059 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-config-data\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.478660 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fa45eb41-34ff-42cb-97f8-71004a7e500f-ceph\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.488076 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa45eb41-34ff-42cb-97f8-71004a7e500f-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.492439 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl6tf\" (UniqueName: \"kubernetes.io/projected/fa45eb41-34ff-42cb-97f8-71004a7e500f-kube-api-access-dl6tf\") pod \"manila-share-share1-0\" (UID: \"fa45eb41-34ff-42cb-97f8-71004a7e500f\") " pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.619072 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 13:32:26 crc kubenswrapper[4913]: I1001 13:32:26.821351 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc45efe6-4d81-459a-9338-606b52f920e1" path="/var/lib/kubelet/pods/dc45efe6-4d81-459a-9338-606b52f920e1/volumes" Oct 01 13:32:27 crc kubenswrapper[4913]: I1001 13:32:27.177204 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 13:32:27 crc kubenswrapper[4913]: I1001 13:32:27.249130 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"fa45eb41-34ff-42cb-97f8-71004a7e500f","Type":"ContainerStarted","Data":"759036df22a48f96b0a2b14f0581533e0b469f66144d643e00a88c6ad454eca6"} Oct 01 13:32:27 crc kubenswrapper[4913]: I1001 13:32:27.257928 4913 generic.go:334] "Generic (PLEG): container finished" podID="71c0701a-649f-46b9-9245-90d837dc9a28" containerID="e8050bee28a591da5b0e35e5768ed6965cb04b322f14d7f47c34af9770801e1d" exitCode=0 Oct 01 13:32:27 crc kubenswrapper[4913]: I1001 13:32:27.258002 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"71c0701a-649f-46b9-9245-90d837dc9a28","Type":"ContainerDied","Data":"e8050bee28a591da5b0e35e5768ed6965cb04b322f14d7f47c34af9770801e1d"} Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.281472 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"71c0701a-649f-46b9-9245-90d837dc9a28","Type":"ContainerDied","Data":"5e9c7e84fb1e14d36c9e5b6d064eaa50fc1d0b1a480d3196425811769eb5d9a9"} Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.282174 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9c7e84fb1e14d36c9e5b6d064eaa50fc1d0b1a480d3196425811769eb5d9a9" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.290869 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"fa45eb41-34ff-42cb-97f8-71004a7e500f","Type":"ContainerStarted","Data":"1a0d5f9ea87913e90cb9f7c741949aa6315e0ff11a7868d96f5746975c0630a0"} Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.351757 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.434982 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data\") pod \"71c0701a-649f-46b9-9245-90d837dc9a28\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.435463 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-scripts\") pod \"71c0701a-649f-46b9-9245-90d837dc9a28\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.435495 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data-custom\") pod \"71c0701a-649f-46b9-9245-90d837dc9a28\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.435648 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-combined-ca-bundle\") pod \"71c0701a-649f-46b9-9245-90d837dc9a28\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.435895 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cb5b\" (UniqueName: \"kubernetes.io/projected/71c0701a-649f-46b9-9245-90d837dc9a28-kube-api-access-6cb5b\") pod \"71c0701a-649f-46b9-9245-90d837dc9a28\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.436000 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71c0701a-649f-46b9-9245-90d837dc9a28-etc-machine-id\") pod \"71c0701a-649f-46b9-9245-90d837dc9a28\" (UID: \"71c0701a-649f-46b9-9245-90d837dc9a28\") " Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.436946 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71c0701a-649f-46b9-9245-90d837dc9a28-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "71c0701a-649f-46b9-9245-90d837dc9a28" (UID: "71c0701a-649f-46b9-9245-90d837dc9a28"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.439768 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "71c0701a-649f-46b9-9245-90d837dc9a28" (UID: "71c0701a-649f-46b9-9245-90d837dc9a28"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.441513 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-scripts" (OuterVolumeSpecName: "scripts") pod "71c0701a-649f-46b9-9245-90d837dc9a28" (UID: "71c0701a-649f-46b9-9245-90d837dc9a28"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.443497 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c0701a-649f-46b9-9245-90d837dc9a28-kube-api-access-6cb5b" (OuterVolumeSpecName: "kube-api-access-6cb5b") pod "71c0701a-649f-46b9-9245-90d837dc9a28" (UID: "71c0701a-649f-46b9-9245-90d837dc9a28"). InnerVolumeSpecName "kube-api-access-6cb5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.497126 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71c0701a-649f-46b9-9245-90d837dc9a28" (UID: "71c0701a-649f-46b9-9245-90d837dc9a28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.539061 4913 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.539099 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.539111 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.539124 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cb5b\" (UniqueName: \"kubernetes.io/projected/71c0701a-649f-46b9-9245-90d837dc9a28-kube-api-access-6cb5b\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.539137 4913 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71c0701a-649f-46b9-9245-90d837dc9a28-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.547007 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data" (OuterVolumeSpecName: "config-data") pod "71c0701a-649f-46b9-9245-90d837dc9a28" (UID: "71c0701a-649f-46b9-9245-90d837dc9a28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:28 crc kubenswrapper[4913]: I1001 13:32:28.641242 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c0701a-649f-46b9-9245-90d837dc9a28-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.104949 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.324325 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.324325 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"fa45eb41-34ff-42cb-97f8-71004a7e500f","Type":"ContainerStarted","Data":"5ec814e0eaa8892668420e6c0318e2dc08b905a8532d1cc3acb5386c440302c4"} Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.351492 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.351471133 podStartE2EDuration="3.351471133s" podCreationTimestamp="2025-10-01 13:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:32:29.345662448 +0000 UTC m=+3281.249138026" watchObservedRunningTime="2025-10-01 13:32:29.351471133 +0000 UTC m=+3281.254946711" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.369369 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.379994 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.390174 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:29 crc kubenswrapper[4913]: E1001 13:32:29.390626 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="manila-scheduler" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.390642 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="manila-scheduler" Oct 01 13:32:29 crc kubenswrapper[4913]: E1001 13:32:29.390658 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="probe" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.390666 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="probe" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.390834 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="probe" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.390855 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" containerName="manila-scheduler" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.391812 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.393806 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.399163 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.467526 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.467769 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-scripts\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.467808 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/350935fd-0db6-4b14-b035-60bd31f6ea57-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.467865 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.467894 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-config-data\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.467945 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggpl\" (UniqueName: \"kubernetes.io/projected/350935fd-0db6-4b14-b035-60bd31f6ea57-kube-api-access-cggpl\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.477862 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6fjf6"] Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.481125 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.497235 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fjf6"] Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.569809 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.569860 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-scripts\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.569893 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlb4\" (UniqueName: \"kubernetes.io/projected/0d880085-4d34-4cc6-be6b-0449065a2f16-kube-api-access-zwlb4\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.569945 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/350935fd-0db6-4b14-b035-60bd31f6ea57-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.570004 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.570077 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-utilities\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.570113 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/350935fd-0db6-4b14-b035-60bd31f6ea57-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.570126 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-config-data\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.570339 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cggpl\" (UniqueName: \"kubernetes.io/projected/350935fd-0db6-4b14-b035-60bd31f6ea57-kube-api-access-cggpl\") pod 
\"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.570469 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-catalog-content\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.575783 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.575932 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-config-data\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.577679 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.584764 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350935fd-0db6-4b14-b035-60bd31f6ea57-scripts\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.590766 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cggpl\" (UniqueName: \"kubernetes.io/projected/350935fd-0db6-4b14-b035-60bd31f6ea57-kube-api-access-cggpl\") pod \"manila-scheduler-0\" (UID: \"350935fd-0db6-4b14-b035-60bd31f6ea57\") " pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.672161 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-utilities\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.672335 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-catalog-content\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.672396 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlb4\" (UniqueName: \"kubernetes.io/projected/0d880085-4d34-4cc6-be6b-0449065a2f16-kube-api-access-zwlb4\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: 
I1001 13:32:29.672690 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-utilities\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.673044 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-catalog-content\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.697189 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlb4\" (UniqueName: \"kubernetes.io/projected/0d880085-4d34-4cc6-be6b-0449065a2f16-kube-api-access-zwlb4\") pod \"redhat-marketplace-6fjf6\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.712320 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 13:32:29 crc kubenswrapper[4913]: I1001 13:32:29.804357 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:30 crc kubenswrapper[4913]: I1001 13:32:30.172970 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 13:32:30 crc kubenswrapper[4913]: W1001 13:32:30.177045 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod350935fd_0db6_4b14_b035_60bd31f6ea57.slice/crio-cc7a4e56378df4c59fdc8f42ee6cd0af2de3975a3ff9836a224024692aeba57c WatchSource:0}: Error finding container cc7a4e56378df4c59fdc8f42ee6cd0af2de3975a3ff9836a224024692aeba57c: Status 404 returned error can't find the container with id cc7a4e56378df4c59fdc8f42ee6cd0af2de3975a3ff9836a224024692aeba57c Oct 01 13:32:30 crc kubenswrapper[4913]: I1001 13:32:30.288472 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fjf6"] Oct 01 13:32:30 crc kubenswrapper[4913]: I1001 13:32:30.352031 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"350935fd-0db6-4b14-b035-60bd31f6ea57","Type":"ContainerStarted","Data":"cc7a4e56378df4c59fdc8f42ee6cd0af2de3975a3ff9836a224024692aeba57c"} Oct 01 13:32:30 crc kubenswrapper[4913]: I1001 13:32:30.355387 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerStarted","Data":"8365e200034dd10c0bc4d4304ca7069408e35113c8bc9b65e916501703301a22"} Oct 01 13:32:30 crc kubenswrapper[4913]: I1001 13:32:30.823507 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c0701a-649f-46b9-9245-90d837dc9a28" path="/var/lib/kubelet/pods/71c0701a-649f-46b9-9245-90d837dc9a28/volumes" Oct 01 13:32:31 crc kubenswrapper[4913]: I1001 13:32:31.365538 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"350935fd-0db6-4b14-b035-60bd31f6ea57","Type":"ContainerStarted","Data":"a1a423d1a2f22a92d42ccff7947dde12bd665547dc64012fea6591e31516f895"} Oct 01 13:32:31 
crc kubenswrapper[4913]: I1001 13:32:31.365895 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"350935fd-0db6-4b14-b035-60bd31f6ea57","Type":"ContainerStarted","Data":"56e93741465b01aed27ea25e3cafc00408132d6829585c89612207de6f7a8e34"} Oct 01 13:32:31 crc kubenswrapper[4913]: I1001 13:32:31.370664 4913 generic.go:334] "Generic (PLEG): container finished" podID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerID="1a199178f0fa9f4c9c3b9f3cdc60598900c339f3a8616f0e2d2d697f25757013" exitCode=0 Oct 01 13:32:31 crc kubenswrapper[4913]: I1001 13:32:31.370718 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerDied","Data":"1a199178f0fa9f4c9c3b9f3cdc60598900c339f3a8616f0e2d2d697f25757013"} Oct 01 13:32:31 crc kubenswrapper[4913]: I1001 13:32:31.388976 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.388953182 podStartE2EDuration="2.388953182s" podCreationTimestamp="2025-10-01 13:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:32:31.38687938 +0000 UTC m=+3283.290354988" watchObservedRunningTime="2025-10-01 13:32:31.388953182 +0000 UTC m=+3283.292428780" Oct 01 13:32:34 crc kubenswrapper[4913]: I1001 13:32:34.440500 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerStarted","Data":"da7654aa6b0a1f337e2de3c731249949a72511fdee7e8ff2dcff57af03093981"} Oct 01 13:32:35 crc kubenswrapper[4913]: I1001 13:32:35.454728 4913 generic.go:334] "Generic (PLEG): container finished" podID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerID="da7654aa6b0a1f337e2de3c731249949a72511fdee7e8ff2dcff57af03093981" exitCode=0 Oct 01 13:32:35 crc kubenswrapper[4913]: I1001 13:32:35.454798 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerDied","Data":"da7654aa6b0a1f337e2de3c731249949a72511fdee7e8ff2dcff57af03093981"} Oct 01 13:32:36 crc kubenswrapper[4913]: I1001 13:32:36.620528 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 01 13:32:37 crc kubenswrapper[4913]: I1001 13:32:37.475721 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerStarted","Data":"d7ac79e85cab6245b355957365e9584f0f4b13b5284453065012c39dfeaab770"} Oct 01 13:32:37 crc kubenswrapper[4913]: I1001 13:32:37.493277 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6fjf6" podStartSLOduration=3.243573322 podStartE2EDuration="8.493247292s" podCreationTimestamp="2025-10-01 13:32:29 +0000 UTC" firstStartedPulling="2025-10-01 13:32:31.372484024 +0000 UTC m=+3283.275959602" lastFinishedPulling="2025-10-01 13:32:36.622157974 +0000 UTC m=+3288.525633572" observedRunningTime="2025-10-01 13:32:37.491738547 +0000 UTC m=+3289.395214145" watchObservedRunningTime="2025-10-01 13:32:37.493247292 +0000 UTC m=+3289.396722870" Oct 01 13:32:39 crc kubenswrapper[4913]: I1001 13:32:39.713682 4913 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 01 13:32:39 crc kubenswrapper[4913]: I1001 13:32:39.806936 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:39 crc kubenswrapper[4913]: I1001 13:32:39.806970 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:39 crc kubenswrapper[4913]: I1001 13:32:39.859985 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:41 crc kubenswrapper[4913]: I1001 13:32:41.466496 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:32:41 crc kubenswrapper[4913]: I1001 13:32:41.475290 4913 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.213482 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.214715 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jzvmv"] Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.216610 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.225573 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzvmv"] Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.237654 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-utilities\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.237697 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv69c\" (UniqueName: \"kubernetes.io/projected/c0033ecf-0062-43de-af82-6a9cc80c7cab-kube-api-access-jv69c\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.237764 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-catalog-content\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.339852 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-utilities\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.339895 4913 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jv69c\" (UniqueName: \"kubernetes.io/projected/c0033ecf-0062-43de-af82-6a9cc80c7cab-kube-api-access-jv69c\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.339965 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-catalog-content\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.340487 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-utilities\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.340766 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-catalog-content\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.364944 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv69c\" (UniqueName: \"kubernetes.io/projected/c0033ecf-0062-43de-af82-6a9cc80c7cab-kube-api-access-jv69c\") pod \"community-operators-jzvmv\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:48 crc kubenswrapper[4913]: I1001 13:32:48.537987 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:49 crc kubenswrapper[4913]: I1001 13:32:49.149552 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzvmv"] Oct 01 13:32:49 crc kubenswrapper[4913]: I1001 13:32:49.577811 4913 generic.go:334] "Generic (PLEG): container finished" podID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerID="d1b459728cdc75accb51f0b621a8e7e0e937b3a447bc6fda37ea369dbb5ca164" exitCode=0 Oct 01 13:32:49 crc kubenswrapper[4913]: I1001 13:32:49.577859 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzvmv" event={"ID":"c0033ecf-0062-43de-af82-6a9cc80c7cab","Type":"ContainerDied","Data":"d1b459728cdc75accb51f0b621a8e7e0e937b3a447bc6fda37ea369dbb5ca164"} Oct 01 13:32:49 crc kubenswrapper[4913]: I1001 13:32:49.578117 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzvmv" event={"ID":"c0033ecf-0062-43de-af82-6a9cc80c7cab","Type":"ContainerStarted","Data":"6b90a1807590948491e6d2e447ca1803aeccab46d6960c33ec9f6fc6eff2ce0a"} Oct 01 13:32:49 crc kubenswrapper[4913]: I1001 13:32:49.858520 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:51 crc kubenswrapper[4913]: I1001 13:32:51.346013 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 01 13:32:51 crc kubenswrapper[4913]: I1001 13:32:51.600313 4913 generic.go:334] "Generic (PLEG): container finished" podID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerID="b81af60d08d6fc1a470ff412edb5c93b01d9513e7daabe35ac8e9ff894b14588" exitCode=0 Oct 01 13:32:51 crc kubenswrapper[4913]: I1001 13:32:51.600377 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzvmv" event={"ID":"c0033ecf-0062-43de-af82-6a9cc80c7cab","Type":"ContainerDied","Data":"b81af60d08d6fc1a470ff412edb5c93b01d9513e7daabe35ac8e9ff894b14588"} Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.191168 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fjf6"] Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.191419 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6fjf6" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="registry-server" containerID="cri-o://d7ac79e85cab6245b355957365e9584f0f4b13b5284453065012c39dfeaab770" gracePeriod=2 Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.613156 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzvmv" event={"ID":"c0033ecf-0062-43de-af82-6a9cc80c7cab","Type":"ContainerStarted","Data":"efb0e2a630088126a4b473b9abd8b0f6ce3403267f180f42aa53fd32ce707cf0"} Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.620244 4913 generic.go:334] "Generic (PLEG): container finished" podID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerID="d7ac79e85cab6245b355957365e9584f0f4b13b5284453065012c39dfeaab770" exitCode=0 Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.620320 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerDied","Data":"d7ac79e85cab6245b355957365e9584f0f4b13b5284453065012c39dfeaab770"} Oct 01 13:32:52 crc 
kubenswrapper[4913]: I1001 13:32:52.642003 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jzvmv" podStartSLOduration=1.862713845 podStartE2EDuration="4.641977825s" podCreationTimestamp="2025-10-01 13:32:48 +0000 UTC" firstStartedPulling="2025-10-01 13:32:49.581375044 +0000 UTC m=+3301.484850622" lastFinishedPulling="2025-10-01 13:32:52.360639024 +0000 UTC m=+3304.264114602" observedRunningTime="2025-10-01 13:32:52.631796217 +0000 UTC m=+3304.535271985" watchObservedRunningTime="2025-10-01 13:32:52.641977825 +0000 UTC m=+3304.545453403" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.708661 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.747639 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-utilities\") pod \"0d880085-4d34-4cc6-be6b-0449065a2f16\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.748076 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwlb4\" (UniqueName: \"kubernetes.io/projected/0d880085-4d34-4cc6-be6b-0449065a2f16-kube-api-access-zwlb4\") pod \"0d880085-4d34-4cc6-be6b-0449065a2f16\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.748238 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-catalog-content\") pod \"0d880085-4d34-4cc6-be6b-0449065a2f16\" (UID: \"0d880085-4d34-4cc6-be6b-0449065a2f16\") " Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.750028 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-utilities" (OuterVolumeSpecName: "utilities") pod "0d880085-4d34-4cc6-be6b-0449065a2f16" (UID: "0d880085-4d34-4cc6-be6b-0449065a2f16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.756000 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d880085-4d34-4cc6-be6b-0449065a2f16-kube-api-access-zwlb4" (OuterVolumeSpecName: "kube-api-access-zwlb4") pod "0d880085-4d34-4cc6-be6b-0449065a2f16" (UID: "0d880085-4d34-4cc6-be6b-0449065a2f16"). InnerVolumeSpecName "kube-api-access-zwlb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.773123 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d880085-4d34-4cc6-be6b-0449065a2f16" (UID: "0d880085-4d34-4cc6-be6b-0449065a2f16"). InnerVolumeSpecName "catalog-content". 
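The pod_startup_latency_tracker entry above reports both podStartSLOduration and podStartE2EDuration for community-operators-jzvmv; the gap between them (about 2.78s here) matches the firstStartedPulling/lastFinishedPulling window, i.e. the image-pull time the SLO figure excludes. A minimal sketch for pulling these numbers out of a dump like this one, assuming Python 3 and a journal reflowed to one entry per line on stdin; the regex mirrors the literal field names above, and the script is illustrative, not part of any OpenShift tooling:

```python
# Minimal sketch: extract "Observed pod startup duration" records from a
# kubelet journal dump and report how much of the end-to-end startup was
# spent pulling images. Field names are taken verbatim from the log entry.
import re
import sys

PATTERN = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
    r' podStartSLOduration=(?P<slo>[\d.]+)'
    r' podStartE2EDuration="(?P<e2e>[\d.]+)s"'
)

def startup_durations(lines):
    for line in lines:
        m = PATTERN.search(line)
        if m:
            slo = float(m.group("slo"))
            e2e = float(m.group("e2e"))
            # The SLO duration excludes image-pull time, so e2e - slo
            # approximates the time spent pulling.
            yield m.group("pod"), slo, e2e, e2e - slo

if __name__ == "__main__":
    for pod, slo, e2e, pull in startup_durations(sys.stdin):
        print(f"{pod}: e2e={e2e:.3f}s slo={slo:.3f}s pull~={pull:.3f}s")
```

Fed the three such entries in this section (for example via journalctl -u kubelet), it would report pull times of roughly 2.78s, 6.63s, and 2.52s.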
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.850307 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwlb4\" (UniqueName: \"kubernetes.io/projected/0d880085-4d34-4cc6-be6b-0449065a2f16-kube-api-access-zwlb4\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.850344 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:52 crc kubenswrapper[4913]: I1001 13:32:52.850355 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d880085-4d34-4cc6-be6b-0449065a2f16-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.630831 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fjf6" Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.630860 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fjf6" event={"ID":"0d880085-4d34-4cc6-be6b-0449065a2f16","Type":"ContainerDied","Data":"8365e200034dd10c0bc4d4304ca7069408e35113c8bc9b65e916501703301a22"} Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.630919 4913 scope.go:117] "RemoveContainer" containerID="d7ac79e85cab6245b355957365e9584f0f4b13b5284453065012c39dfeaab770" Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.651771 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fjf6"] Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.655291 4913 scope.go:117] "RemoveContainer" containerID="da7654aa6b0a1f337e2de3c731249949a72511fdee7e8ff2dcff57af03093981" Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.661288 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fjf6"] Oct 01 13:32:53 crc kubenswrapper[4913]: I1001 13:32:53.676800 4913 scope.go:117] "RemoveContainer" containerID="1a199178f0fa9f4c9c3b9f3cdc60598900c339f3a8616f0e2d2d697f25757013" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.644539 4913 generic.go:334] "Generic (PLEG): container finished" podID="8fe5439a-1b24-4212-a088-0d0787144197" containerID="1fba055e1c7e2efd2496bdf0c8f7d258dc15d14aa6db184b4da26756db5f25ab" exitCode=137 Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.644640 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerDied","Data":"1fba055e1c7e2efd2496bdf0c8f7d258dc15d14aa6db184b4da26756db5f25ab"} Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.727804 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.816218 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" path="/var/lib/kubelet/pods/0d880085-4d34-4cc6-be6b-0449065a2f16/volumes" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859012 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-config-data\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859210 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-log-httpd\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859236 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-ceilometer-tls-certs\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859296 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-scripts\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859329 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-run-httpd\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859415 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-sg-core-conf-yaml\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859479 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-combined-ca-bundle\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859521 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gczjq\" (UniqueName: \"kubernetes.io/projected/8fe5439a-1b24-4212-a088-0d0787144197-kube-api-access-gczjq\") pod \"8fe5439a-1b24-4212-a088-0d0787144197\" (UID: \"8fe5439a-1b24-4212-a088-0d0787144197\") " Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.859863 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.860100 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.860549 4913 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.864650 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe5439a-1b24-4212-a088-0d0787144197-kube-api-access-gczjq" (OuterVolumeSpecName: "kube-api-access-gczjq") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "kube-api-access-gczjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.864965 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-scripts" (OuterVolumeSpecName: "scripts") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.891842 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.909797 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.931913 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.962881 4913 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.962915 4913 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.962926 4913 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe5439a-1b24-4212-a088-0d0787144197-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.962935 4913 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.962944 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.962954 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gczjq\" (UniqueName: \"kubernetes.io/projected/8fe5439a-1b24-4212-a088-0d0787144197-kube-api-access-gczjq\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:54 crc kubenswrapper[4913]: I1001 13:32:54.965304 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-config-data" (OuterVolumeSpecName: "config-data") pod "8fe5439a-1b24-4212-a088-0d0787144197" (UID: "8fe5439a-1b24-4212-a088-0d0787144197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.064695 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5439a-1b24-4212-a088-0d0787144197-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.663738 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe5439a-1b24-4212-a088-0d0787144197","Type":"ContainerDied","Data":"5d4c9adaa2be763bc8ffdedd5926b8f3c2f8d1dd586f702e24095457a7c86f65"} Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.663795 4913 scope.go:117] "RemoveContainer" containerID="1fba055e1c7e2efd2496bdf0c8f7d258dc15d14aa6db184b4da26756db5f25ab" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.663815 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.699257 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.706061 4913 scope.go:117] "RemoveContainer" containerID="5bfe22913f03a481a1e8b313b8b4cfb4c9ff796a0c30348fa46173f34068d8ba" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.709031 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.730415 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731631 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="extract-utilities" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731660 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="extract-utilities" Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731695 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="proxy-httpd" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731702 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="proxy-httpd" Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731713 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="extract-content" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731718 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="extract-content" Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731733 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-notification-agent" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731739 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-notification-agent" Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731754 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="registry-server" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731761 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="registry-server" Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731772 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="sg-core" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731777 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="sg-core" Oct 01 13:32:55 crc kubenswrapper[4913]: E1001 13:32:55.731791 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-central-agent" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731797 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-central-agent" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731975 4913 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-notification-agent" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731985 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="sg-core" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.731999 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d880085-4d34-4cc6-be6b-0449065a2f16" containerName="registry-server" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.732016 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="ceilometer-central-agent" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.732029 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe5439a-1b24-4212-a088-0d0787144197" containerName="proxy-httpd" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.732541 4913 scope.go:117] "RemoveContainer" containerID="37caa8b073817a2940d2cab5d32115df22a476a6d20df351c0604d9d36ba7647" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.734322 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.737243 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.738122 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.738374 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.750315 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.769516 4913 scope.go:117] "RemoveContainer" containerID="149dc5c3d838d43d2dd65373e5769478f0121e6be648abb124b5110bf5c6c9ca" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779006 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779076 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brc55\" (UniqueName: \"kubernetes.io/projected/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-kube-api-access-brc55\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779096 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779134 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-scripts\") pod 
\"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779153 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779230 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-run-httpd\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779248 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-log-httpd\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.779276 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-config-data\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.881558 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.881873 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.881961 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brc55\" (UniqueName: \"kubernetes.io/projected/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-kube-api-access-brc55\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.882063 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-scripts\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.882140 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.882328 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-run-httpd\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.882414 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-log-httpd\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.882483 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-config-data\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.882875 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-run-httpd\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.883184 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-log-httpd\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.886864 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.886944 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.886984 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-scripts\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.891069 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-config-data\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.901921 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brc55\" (UniqueName: \"kubernetes.io/projected/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-kube-api-access-brc55\") pod \"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:55 crc kubenswrapper[4913]: I1001 13:32:55.902061 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdfbdb5-1891-4b89-a07a-d468ac3c7155-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"ffdfbdb5-1891-4b89-a07a-d468ac3c7155\") " pod="openstack/ceilometer-0" Oct 01 13:32:56 crc kubenswrapper[4913]: I1001 13:32:56.063064 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:32:56 crc kubenswrapper[4913]: I1001 13:32:56.494960 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:32:56 crc kubenswrapper[4913]: W1001 13:32:56.501502 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffdfbdb5_1891_4b89_a07a_d468ac3c7155.slice/crio-3f21be5573a79dd28f4124877ddc5c4a27178c871197c1ac6602a5cde64d90a4 WatchSource:0}: Error finding container 3f21be5573a79dd28f4124877ddc5c4a27178c871197c1ac6602a5cde64d90a4: Status 404 returned error can't find the container with id 3f21be5573a79dd28f4124877ddc5c4a27178c871197c1ac6602a5cde64d90a4 Oct 01 13:32:56 crc kubenswrapper[4913]: I1001 13:32:56.676992 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffdfbdb5-1891-4b89-a07a-d468ac3c7155","Type":"ContainerStarted","Data":"3f21be5573a79dd28f4124877ddc5c4a27178c871197c1ac6602a5cde64d90a4"} Oct 01 13:32:56 crc kubenswrapper[4913]: I1001 13:32:56.819469 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe5439a-1b24-4212-a088-0d0787144197" path="/var/lib/kubelet/pods/8fe5439a-1b24-4212-a088-0d0787144197/volumes" Oct 01 13:32:58 crc kubenswrapper[4913]: I1001 13:32:58.539211 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:58 crc kubenswrapper[4913]: I1001 13:32:58.540115 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:58 crc kubenswrapper[4913]: I1001 13:32:58.594219 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:58 crc kubenswrapper[4913]: I1001 13:32:58.708003 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffdfbdb5-1891-4b89-a07a-d468ac3c7155","Type":"ContainerStarted","Data":"3bb5f9003e398b718ae1fe6857ea3682b7a1aaf5ddabce185d2975e885ac2259"} Oct 01 13:32:58 crc kubenswrapper[4913]: I1001 13:32:58.767493 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:32:58 crc kubenswrapper[4913]: I1001 13:32:58.830564 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzvmv"] Oct 01 13:32:59 crc kubenswrapper[4913]: I1001 13:32:59.718933 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffdfbdb5-1891-4b89-a07a-d468ac3c7155","Type":"ContainerStarted","Data":"2ba9492c84b2a358866c29425869fba34c0e1007307facc7e9b7410cd41ccbc6"} Oct 01 13:33:00 crc kubenswrapper[4913]: I1001 13:33:00.730760 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffdfbdb5-1891-4b89-a07a-d468ac3c7155","Type":"ContainerStarted","Data":"639ab16cc2e8afd7a56e1fc501adc88ecd3f5401e259393f70bb932990990230"} Oct 01 13:33:00 crc kubenswrapper[4913]: I1001 13:33:00.730912 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jzvmv" 
podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="registry-server" containerID="cri-o://efb0e2a630088126a4b473b9abd8b0f6ce3403267f180f42aa53fd32ce707cf0" gracePeriod=2 Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.747701 4913 generic.go:334] "Generic (PLEG): container finished" podID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerID="efb0e2a630088126a4b473b9abd8b0f6ce3403267f180f42aa53fd32ce707cf0" exitCode=0 Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.748392 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzvmv" event={"ID":"c0033ecf-0062-43de-af82-6a9cc80c7cab","Type":"ContainerDied","Data":"efb0e2a630088126a4b473b9abd8b0f6ce3403267f180f42aa53fd32ce707cf0"} Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.856569 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.924929 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv69c\" (UniqueName: \"kubernetes.io/projected/c0033ecf-0062-43de-af82-6a9cc80c7cab-kube-api-access-jv69c\") pod \"c0033ecf-0062-43de-af82-6a9cc80c7cab\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.925091 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-utilities\") pod \"c0033ecf-0062-43de-af82-6a9cc80c7cab\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.925145 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-catalog-content\") pod \"c0033ecf-0062-43de-af82-6a9cc80c7cab\" (UID: \"c0033ecf-0062-43de-af82-6a9cc80c7cab\") " Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.933509 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0033ecf-0062-43de-af82-6a9cc80c7cab-kube-api-access-jv69c" (OuterVolumeSpecName: "kube-api-access-jv69c") pod "c0033ecf-0062-43de-af82-6a9cc80c7cab" (UID: "c0033ecf-0062-43de-af82-6a9cc80c7cab"). InnerVolumeSpecName "kube-api-access-jv69c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.936019 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-utilities" (OuterVolumeSpecName: "utilities") pod "c0033ecf-0062-43de-af82-6a9cc80c7cab" (UID: "c0033ecf-0062-43de-af82-6a9cc80c7cab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:33:02 crc kubenswrapper[4913]: I1001 13:33:02.984968 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0033ecf-0062-43de-af82-6a9cc80c7cab" (UID: "c0033ecf-0062-43de-af82-6a9cc80c7cab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.027899 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv69c\" (UniqueName: \"kubernetes.io/projected/c0033ecf-0062-43de-af82-6a9cc80c7cab-kube-api-access-jv69c\") on node \"crc\" DevicePath \"\"" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.027950 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.027963 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0033ecf-0062-43de-af82-6a9cc80c7cab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.760390 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzvmv" event={"ID":"c0033ecf-0062-43de-af82-6a9cc80c7cab","Type":"ContainerDied","Data":"6b90a1807590948491e6d2e447ca1803aeccab46d6960c33ec9f6fc6eff2ce0a"} Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.760420 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzvmv" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.760804 4913 scope.go:117] "RemoveContainer" containerID="efb0e2a630088126a4b473b9abd8b0f6ce3403267f180f42aa53fd32ce707cf0" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.763806 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffdfbdb5-1891-4b89-a07a-d468ac3c7155","Type":"ContainerStarted","Data":"94b848dd49b11b34442300b8c8b3441c94bfe1e4231208c55f951409b4171aa3"} Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.764020 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.796924 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.171629881 podStartE2EDuration="8.796900871s" podCreationTimestamp="2025-10-01 13:32:55 +0000 UTC" firstStartedPulling="2025-10-01 13:32:56.504051891 +0000 UTC m=+3308.407527489" lastFinishedPulling="2025-10-01 13:33:03.129322901 +0000 UTC m=+3315.032798479" observedRunningTime="2025-10-01 13:33:03.796499919 +0000 UTC m=+3315.699975517" watchObservedRunningTime="2025-10-01 13:33:03.796900871 +0000 UTC m=+3315.700376449" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.842853 4913 scope.go:117] "RemoveContainer" containerID="b81af60d08d6fc1a470ff412edb5c93b01d9513e7daabe35ac8e9ff894b14588" Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.844602 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzvmv"] Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.855356 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jzvmv"] Oct 01 13:33:03 crc kubenswrapper[4913]: I1001 13:33:03.876183 4913 scope.go:117] "RemoveContainer" containerID="d1b459728cdc75accb51f0b621a8e7e0e937b3a447bc6fda37ea369dbb5ca164" Oct 01 13:33:04 crc kubenswrapper[4913]: I1001 13:33:04.818978 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" 
path="/var/lib/kubelet/pods/c0033ecf-0062-43de-af82-6a9cc80c7cab/volumes" Oct 01 13:33:10 crc kubenswrapper[4913]: I1001 13:33:10.084218 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:33:10 crc kubenswrapper[4913]: I1001 13:33:10.085154 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:33:26 crc kubenswrapper[4913]: I1001 13:33:26.072949 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.324738 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z48z7"] Oct 01 13:33:31 crc kubenswrapper[4913]: E1001 13:33:31.325677 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="extract-content" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.325692 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="extract-content" Oct 01 13:33:31 crc kubenswrapper[4913]: E1001 13:33:31.325736 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="registry-server" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.325742 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="registry-server" Oct 01 13:33:31 crc kubenswrapper[4913]: E1001 13:33:31.325757 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="extract-utilities" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.325763 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="extract-utilities" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.325965 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0033ecf-0062-43de-af82-6a9cc80c7cab" containerName="registry-server" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.334774 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.354332 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z48z7"] Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.427618 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-utilities\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.427734 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqlqc\" (UniqueName: \"kubernetes.io/projected/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-kube-api-access-cqlqc\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.428010 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-catalog-content\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.529958 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqlqc\" (UniqueName: \"kubernetes.io/projected/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-kube-api-access-cqlqc\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.530144 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-catalog-content\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.530188 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-utilities\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.530821 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-utilities\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.530834 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-catalog-content\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.554287 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cqlqc\" (UniqueName: \"kubernetes.io/projected/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-kube-api-access-cqlqc\") pod \"certified-operators-z48z7\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:31 crc kubenswrapper[4913]: I1001 13:33:31.660352 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:32 crc kubenswrapper[4913]: W1001 13:33:32.187705 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf392b16d_f0e2_4c0c_be69_3a6ee1c47cab.slice/crio-c9f6e0734735a56404f1659d4448086071c5d6e34304bad09d6e0bd4c4f1b06e WatchSource:0}: Error finding container c9f6e0734735a56404f1659d4448086071c5d6e34304bad09d6e0bd4c4f1b06e: Status 404 returned error can't find the container with id c9f6e0734735a56404f1659d4448086071c5d6e34304bad09d6e0bd4c4f1b06e Oct 01 13:33:32 crc kubenswrapper[4913]: I1001 13:33:32.190113 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z48z7"] Oct 01 13:33:33 crc kubenswrapper[4913]: I1001 13:33:33.016620 4913 generic.go:334] "Generic (PLEG): container finished" podID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerID="26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab" exitCode=0 Oct 01 13:33:33 crc kubenswrapper[4913]: I1001 13:33:33.016663 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z48z7" event={"ID":"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab","Type":"ContainerDied","Data":"26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab"} Oct 01 13:33:33 crc kubenswrapper[4913]: I1001 13:33:33.016893 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z48z7" event={"ID":"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab","Type":"ContainerStarted","Data":"c9f6e0734735a56404f1659d4448086071c5d6e34304bad09d6e0bd4c4f1b06e"} Oct 01 13:33:35 crc kubenswrapper[4913]: I1001 13:33:35.035075 4913 generic.go:334] "Generic (PLEG): container finished" podID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerID="97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582" exitCode=0 Oct 01 13:33:35 crc kubenswrapper[4913]: I1001 13:33:35.035196 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z48z7" event={"ID":"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab","Type":"ContainerDied","Data":"97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582"} Oct 01 13:33:36 crc kubenswrapper[4913]: I1001 13:33:36.050504 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z48z7" event={"ID":"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab","Type":"ContainerStarted","Data":"4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1"} Oct 01 13:33:36 crc kubenswrapper[4913]: I1001 13:33:36.075293 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z48z7" podStartSLOduration=2.551714529 podStartE2EDuration="5.075252564s" podCreationTimestamp="2025-10-01 13:33:31 +0000 UTC" firstStartedPulling="2025-10-01 13:33:33.018392855 +0000 UTC m=+3344.921868443" lastFinishedPulling="2025-10-01 13:33:35.5419309 +0000 UTC m=+3347.445406478" observedRunningTime="2025-10-01 13:33:36.067959013 +0000 UTC 
m=+3347.971434611" watchObservedRunningTime="2025-10-01 13:33:36.075252564 +0000 UTC m=+3347.978728142" Oct 01 13:33:40 crc kubenswrapper[4913]: I1001 13:33:40.083635 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:33:40 crc kubenswrapper[4913]: I1001 13:33:40.084308 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:33:41 crc kubenswrapper[4913]: I1001 13:33:41.661392 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:41 crc kubenswrapper[4913]: I1001 13:33:41.661680 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:41 crc kubenswrapper[4913]: I1001 13:33:41.718869 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:42 crc kubenswrapper[4913]: I1001 13:33:42.158533 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:42 crc kubenswrapper[4913]: I1001 13:33:42.216237 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z48z7"] Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.127529 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z48z7" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="registry-server" containerID="cri-o://4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1" gracePeriod=2 Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.642658 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.729590 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqlqc\" (UniqueName: \"kubernetes.io/projected/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-kube-api-access-cqlqc\") pod \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.729798 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-utilities\") pod \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.729824 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-catalog-content\") pod \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\" (UID: \"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab\") " Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.730681 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-utilities" (OuterVolumeSpecName: "utilities") pod "f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" (UID: "f392b16d-f0e2-4c0c-be69-3a6ee1c47cab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.743787 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-kube-api-access-cqlqc" (OuterVolumeSpecName: "kube-api-access-cqlqc") pod "f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" (UID: "f392b16d-f0e2-4c0c-be69-3a6ee1c47cab"). InnerVolumeSpecName "kube-api-access-cqlqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.831998 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqlqc\" (UniqueName: \"kubernetes.io/projected/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-kube-api-access-cqlqc\") on node \"crc\" DevicePath \"\"" Oct 01 13:33:44 crc kubenswrapper[4913]: I1001 13:33:44.832045 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.137539 4913 generic.go:334] "Generic (PLEG): container finished" podID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerID="4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1" exitCode=0 Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.137606 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z48z7" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.137619 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z48z7" event={"ID":"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab","Type":"ContainerDied","Data":"4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1"} Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.138196 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z48z7" event={"ID":"f392b16d-f0e2-4c0c-be69-3a6ee1c47cab","Type":"ContainerDied","Data":"c9f6e0734735a56404f1659d4448086071c5d6e34304bad09d6e0bd4c4f1b06e"} Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.138220 4913 scope.go:117] "RemoveContainer" containerID="4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.156473 4913 scope.go:117] "RemoveContainer" containerID="97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.177888 4913 scope.go:117] "RemoveContainer" containerID="26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.235106 4913 scope.go:117] "RemoveContainer" containerID="4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1" Oct 01 13:33:45 crc kubenswrapper[4913]: E1001 13:33:45.235496 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1\": container with ID starting with 4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1 not found: ID does not exist" containerID="4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.235533 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1"} err="failed to get container status \"4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1\": rpc error: code = NotFound desc = could not find container \"4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1\": container with ID starting with 4900bfd4a0eba428700bf31e32c947e757c3ce2e720d6cafe7e26ff8be76ebe1 not found: ID does not exist" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.235556 4913 scope.go:117] "RemoveContainer" containerID="97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582" Oct 01 13:33:45 crc kubenswrapper[4913]: E1001 13:33:45.235916 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582\": container with ID starting with 97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582 not found: ID does not exist" containerID="97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.235942 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582"} err="failed to get container status \"97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582\": rpc error: code = NotFound desc = could not find container 
\"97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582\": container with ID starting with 97923bd5f4f90a6061ade91b151ed9f6d1fde491c210e4ce0d72abc2781e1582 not found: ID does not exist" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.235956 4913 scope.go:117] "RemoveContainer" containerID="26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab" Oct 01 13:33:45 crc kubenswrapper[4913]: E1001 13:33:45.236296 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab\": container with ID starting with 26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab not found: ID does not exist" containerID="26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.236348 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab"} err="failed to get container status \"26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab\": rpc error: code = NotFound desc = could not find container \"26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab\": container with ID starting with 26167e056d709fd02f17933bb0b36b065bae00969d916e755702d2c880d2ddab not found: ID does not exist" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.558442 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" (UID: "f392b16d-f0e2-4c0c-be69-3a6ee1c47cab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.647283 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.774943 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z48z7"] Oct 01 13:33:45 crc kubenswrapper[4913]: I1001 13:33:45.782149 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z48z7"] Oct 01 13:33:46 crc kubenswrapper[4913]: I1001 13:33:46.818006 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" path="/var/lib/kubelet/pods/f392b16d-f0e2-4c0c-be69-3a6ee1c47cab/volumes" Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.083551 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.084135 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.084258 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.085047 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6822e12a17aae1690bae98ddc4a0974be6566860672dde11dc55972de0232d9"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.085116 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://c6822e12a17aae1690bae98ddc4a0974be6566860672dde11dc55972de0232d9" gracePeriod=600 Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.374111 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="c6822e12a17aae1690bae98ddc4a0974be6566860672dde11dc55972de0232d9" exitCode=0 Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.374180 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"c6822e12a17aae1690bae98ddc4a0974be6566860672dde11dc55972de0232d9"} Oct 01 13:34:10 crc kubenswrapper[4913]: I1001 13:34:10.374467 4913 scope.go:117] "RemoveContainer" containerID="50e032999d90374ec7de5ce2310b7665b46c6eea2afc7996cf659fcdbea0b7e6" Oct 01 13:34:11 crc kubenswrapper[4913]: I1001 13:34:11.389940 4913 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc"} Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.618882 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq"] Oct 01 13:34:12 crc kubenswrapper[4913]: E1001 13:34:12.619749 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="registry-server" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.619761 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="registry-server" Oct 01 13:34:12 crc kubenswrapper[4913]: E1001 13:34:12.619772 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="extract-utilities" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.619779 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="extract-utilities" Oct 01 13:34:12 crc kubenswrapper[4913]: E1001 13:34:12.619794 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="extract-content" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.619800 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="extract-content" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.620015 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f392b16d-f0e2-4c0c-be69-3a6ee1c47cab" containerName="registry-server" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.621452 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.636168 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq"] Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.679002 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvqqk\" (UniqueName: \"kubernetes.io/projected/95583d77-cff7-452a-85eb-0674b64df62a-kube-api-access-jvqqk\") pod \"openstack-operator-controller-operator-6477d86654-p7vhq\" (UID: \"95583d77-cff7-452a-85eb-0674b64df62a\") " pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.780439 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvqqk\" (UniqueName: \"kubernetes.io/projected/95583d77-cff7-452a-85eb-0674b64df62a-kube-api-access-jvqqk\") pod \"openstack-operator-controller-operator-6477d86654-p7vhq\" (UID: \"95583d77-cff7-452a-85eb-0674b64df62a\") " pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.799728 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvqqk\" (UniqueName: \"kubernetes.io/projected/95583d77-cff7-452a-85eb-0674b64df62a-kube-api-access-jvqqk\") pod \"openstack-operator-controller-operator-6477d86654-p7vhq\" (UID: \"95583d77-cff7-452a-85eb-0674b64df62a\") " pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:12 crc kubenswrapper[4913]: I1001 13:34:12.962567 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:13 crc kubenswrapper[4913]: I1001 13:34:13.570654 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq"] Oct 01 13:34:14 crc kubenswrapper[4913]: I1001 13:34:14.417776 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" event={"ID":"95583d77-cff7-452a-85eb-0674b64df62a","Type":"ContainerStarted","Data":"4ac69529d682c3132281ec065244680725ec220e41c7ce76fac1c8fddd82777c"} Oct 01 13:34:14 crc kubenswrapper[4913]: I1001 13:34:14.418100 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" event={"ID":"95583d77-cff7-452a-85eb-0674b64df62a","Type":"ContainerStarted","Data":"d8874ee5087582cd67643d77f13f7e7ba27decc01977f2655f123b92c78f864b"} Oct 01 13:34:14 crc kubenswrapper[4913]: I1001 13:34:14.418112 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" event={"ID":"95583d77-cff7-452a-85eb-0674b64df62a","Type":"ContainerStarted","Data":"375a1d8c05d833ed09e47bd556d446526498baa77dfa072e744c57e55027bbfa"} Oct 01 13:34:14 crc kubenswrapper[4913]: I1001 13:34:14.418491 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:14 crc kubenswrapper[4913]: I1001 13:34:14.447437 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" podStartSLOduration=2.447419327 podStartE2EDuration="2.447419327s" podCreationTimestamp="2025-10-01 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:34:14.445184394 +0000 UTC m=+3386.348660012" watchObservedRunningTime="2025-10-01 13:34:14.447419327 +0000 UTC m=+3386.350894905" Oct 01 13:34:22 crc kubenswrapper[4913]: I1001 13:34:22.965461 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6477d86654-p7vhq" Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.058476 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57"] Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.059008 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="operator" containerID="cri-o://ba0649fae8ccbf4a91a923579d5ba385abe3cff70aa185780748226971ea2f51" gracePeriod=10 Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.059095 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="kube-rbac-proxy" containerID="cri-o://f245fb80f640141d9ea93f59166cf3908fd0e33bd29a5d1508d345c1fb6647e9" gracePeriod=10 Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.491556 4913 generic.go:334] "Generic (PLEG): container finished" 
podID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerID="f245fb80f640141d9ea93f59166cf3908fd0e33bd29a5d1508d345c1fb6647e9" exitCode=0 Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.492631 4913 generic.go:334] "Generic (PLEG): container finished" podID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerID="ba0649fae8ccbf4a91a923579d5ba385abe3cff70aa185780748226971ea2f51" exitCode=0 Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.491624 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" event={"ID":"d54b747c-e08e-4b5c-b0ea-74651018cc09","Type":"ContainerDied","Data":"f245fb80f640141d9ea93f59166cf3908fd0e33bd29a5d1508d345c1fb6647e9"} Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.492690 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" event={"ID":"d54b747c-e08e-4b5c-b0ea-74651018cc09","Type":"ContainerDied","Data":"ba0649fae8ccbf4a91a923579d5ba385abe3cff70aa185780748226971ea2f51"} Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.492711 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" event={"ID":"d54b747c-e08e-4b5c-b0ea-74651018cc09","Type":"ContainerDied","Data":"d6259bf80a40757815bfe27206adbbee49d52fb33b5b5d5a0cc1e1372f4cfd4f"} Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.492725 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6259bf80a40757815bfe27206adbbee49d52fb33b5b5d5a0cc1e1372f4cfd4f" Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.527855 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.605958 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjj4h\" (UniqueName: \"kubernetes.io/projected/d54b747c-e08e-4b5c-b0ea-74651018cc09-kube-api-access-fjj4h\") pod \"d54b747c-e08e-4b5c-b0ea-74651018cc09\" (UID: \"d54b747c-e08e-4b5c-b0ea-74651018cc09\") " Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.616443 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54b747c-e08e-4b5c-b0ea-74651018cc09-kube-api-access-fjj4h" (OuterVolumeSpecName: "kube-api-access-fjj4h") pod "d54b747c-e08e-4b5c-b0ea-74651018cc09" (UID: "d54b747c-e08e-4b5c-b0ea-74651018cc09"). InnerVolumeSpecName "kube-api-access-fjj4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:34:23 crc kubenswrapper[4913]: I1001 13:34:23.708086 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjj4h\" (UniqueName: \"kubernetes.io/projected/d54b747c-e08e-4b5c-b0ea-74651018cc09-kube-api-access-fjj4h\") on node \"crc\" DevicePath \"\"" Oct 01 13:34:24 crc kubenswrapper[4913]: I1001 13:34:24.500303 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57" Oct 01 13:34:24 crc kubenswrapper[4913]: I1001 13:34:24.538780 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57"] Oct 01 13:34:24 crc kubenswrapper[4913]: I1001 13:34:24.552159 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-676c66f88b-7nb57"] Oct 01 13:34:24 crc kubenswrapper[4913]: I1001 13:34:24.817219 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" path="/var/lib/kubelet/pods/d54b747c-e08e-4b5c-b0ea-74651018cc09/volumes" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.257792 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5846bf4994-z259j"] Oct 01 13:34:58 crc kubenswrapper[4913]: E1001 13:34:58.258853 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="kube-rbac-proxy" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.258869 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="kube-rbac-proxy" Oct 01 13:34:58 crc kubenswrapper[4913]: E1001 13:34:58.258887 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="operator" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.258895 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="operator" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.259109 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="kube-rbac-proxy" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.259125 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54b747c-e08e-4b5c-b0ea-74651018cc09" containerName="operator" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.260636 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.289042 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5846bf4994-z259j"] Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.402546 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqnl\" (UniqueName: \"kubernetes.io/projected/fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2-kube-api-access-btqnl\") pod \"test-operator-controller-manager-5846bf4994-z259j\" (UID: \"fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2\") " pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.504785 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btqnl\" (UniqueName: \"kubernetes.io/projected/fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2-kube-api-access-btqnl\") pod \"test-operator-controller-manager-5846bf4994-z259j\" (UID: \"fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2\") " pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.525769 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqnl\" (UniqueName: \"kubernetes.io/projected/fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2-kube-api-access-btqnl\") pod \"test-operator-controller-manager-5846bf4994-z259j\" (UID: \"fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2\") " pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:34:58 crc kubenswrapper[4913]: I1001 13:34:58.591244 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:34:59 crc kubenswrapper[4913]: I1001 13:34:59.027959 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5846bf4994-z259j"] Oct 01 13:34:59 crc kubenswrapper[4913]: I1001 13:34:59.808193 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" event={"ID":"fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2","Type":"ContainerStarted","Data":"86dbef63e8ddf17afc6b370b5b8f2bbffc58998ab67f9a0703a231cdacdcce06"} Oct 01 13:35:01 crc kubenswrapper[4913]: I1001 13:35:01.833783 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" event={"ID":"fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2","Type":"ContainerStarted","Data":"89ad1ee9daf70bf4b2507eca9fb6db17c676bce36ffaede8768ea682b1978cf5"} Oct 01 13:35:01 crc kubenswrapper[4913]: I1001 13:35:01.835339 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" event={"ID":"fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2","Type":"ContainerStarted","Data":"81eaf3a9c42b543056757b9e4bafd1850fd042375a780dfa7543ecc86c5a3555"} Oct 01 13:35:01 crc kubenswrapper[4913]: I1001 13:35:01.835367 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:35:01 crc kubenswrapper[4913]: I1001 13:35:01.852203 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" podStartSLOduration=2.356634856 podStartE2EDuration="3.852159484s" podCreationTimestamp="2025-10-01 13:34:58 +0000 UTC" firstStartedPulling="2025-10-01 13:34:59.039903252 +0000 UTC m=+3430.943378830" lastFinishedPulling="2025-10-01 13:35:00.53542784 +0000 UTC m=+3432.438903458" observedRunningTime="2025-10-01 13:35:01.849571922 +0000 UTC m=+3433.753047560" watchObservedRunningTime="2025-10-01 13:35:01.852159484 +0000 UTC m=+3433.755635062" Oct 01 13:35:07 crc kubenswrapper[4913]: I1001 13:35:07.561740 4913 scope.go:117] "RemoveContainer" containerID="f245fb80f640141d9ea93f59166cf3908fd0e33bd29a5d1508d345c1fb6647e9" Oct 01 13:35:07 crc kubenswrapper[4913]: I1001 13:35:07.596780 4913 scope.go:117] "RemoveContainer" containerID="ba0649fae8ccbf4a91a923579d5ba385abe3cff70aa185780748226971ea2f51" Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.594215 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5846bf4994-z259j" Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.638661 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"] Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.639356 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="kube-rbac-proxy" containerID="cri-o://3450c156ecabbf8a8ef9a6a53cfacf89e7ea8f968ea6bef9991416b9a89b1e4d" gracePeriod=10 Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.639334 4913 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="manager" containerID="cri-o://9454bf795bfa4c31c09f5e818ff5a1389442ce13ca104ee344c25379b216f6c0" gracePeriod=10 Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.920466 4913 generic.go:334] "Generic (PLEG): container finished" podID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerID="3450c156ecabbf8a8ef9a6a53cfacf89e7ea8f968ea6bef9991416b9a89b1e4d" exitCode=0 Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.920688 4913 generic.go:334] "Generic (PLEG): container finished" podID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerID="9454bf795bfa4c31c09f5e818ff5a1389442ce13ca104ee344c25379b216f6c0" exitCode=0 Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.920531 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" event={"ID":"f7389885-bb97-4d1d-8c55-9d3f76eda8ed","Type":"ContainerDied","Data":"3450c156ecabbf8a8ef9a6a53cfacf89e7ea8f968ea6bef9991416b9a89b1e4d"} Oct 01 13:35:08 crc kubenswrapper[4913]: I1001 13:35:08.920723 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" event={"ID":"f7389885-bb97-4d1d-8c55-9d3f76eda8ed","Type":"ContainerDied","Data":"9454bf795bfa4c31c09f5e818ff5a1389442ce13ca104ee344c25379b216f6c0"} Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.083218 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.227712 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngkv6\" (UniqueName: \"kubernetes.io/projected/f7389885-bb97-4d1d-8c55-9d3f76eda8ed-kube-api-access-ngkv6\") pod \"f7389885-bb97-4d1d-8c55-9d3f76eda8ed\" (UID: \"f7389885-bb97-4d1d-8c55-9d3f76eda8ed\") " Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.248304 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7389885-bb97-4d1d-8c55-9d3f76eda8ed-kube-api-access-ngkv6" (OuterVolumeSpecName: "kube-api-access-ngkv6") pod "f7389885-bb97-4d1d-8c55-9d3f76eda8ed" (UID: "f7389885-bb97-4d1d-8c55-9d3f76eda8ed"). InnerVolumeSpecName "kube-api-access-ngkv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.330084 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngkv6\" (UniqueName: \"kubernetes.io/projected/f7389885-bb97-4d1d-8c55-9d3f76eda8ed-kube-api-access-ngkv6\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.932847 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" event={"ID":"f7389885-bb97-4d1d-8c55-9d3f76eda8ed","Type":"ContainerDied","Data":"ec139ab227cf61737a69428bede1db42c223edbfe4fb34470a4b721e62a07c40"} Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.933164 4913 scope.go:117] "RemoveContainer" containerID="3450c156ecabbf8a8ef9a6a53cfacf89e7ea8f968ea6bef9991416b9a89b1e4d" Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.932941 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl" Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.962357 4913 scope.go:117] "RemoveContainer" containerID="9454bf795bfa4c31c09f5e818ff5a1389442ce13ca104ee344c25379b216f6c0" Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.982404 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"] Oct 01 13:35:09 crc kubenswrapper[4913]: I1001 13:35:09.998542 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-r9twl"] Oct 01 13:35:10 crc kubenswrapper[4913]: I1001 13:35:10.817256 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" path="/var/lib/kubelet/pods/f7389885-bb97-4d1d-8c55-9d3f76eda8ed/volumes" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.939637 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ssr7v"] Oct 01 13:35:32 crc kubenswrapper[4913]: E1001 13:35:32.940638 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="manager" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.940654 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="manager" Oct 01 13:35:32 crc kubenswrapper[4913]: E1001 13:35:32.940708 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="kube-rbac-proxy" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.940717 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="kube-rbac-proxy" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.940930 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="kube-rbac-proxy" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.940957 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7389885-bb97-4d1d-8c55-9d3f76eda8ed" containerName="manager" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.942659 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:32 crc kubenswrapper[4913]: I1001 13:35:32.955554 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ssr7v"] Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.095974 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4r8\" (UniqueName: \"kubernetes.io/projected/e959e97f-cd28-4a0d-9b23-4f355b755530-kube-api-access-2k4r8\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.096400 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-catalog-content\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.096457 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-utilities\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.198134 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4r8\" (UniqueName: \"kubernetes.io/projected/e959e97f-cd28-4a0d-9b23-4f355b755530-kube-api-access-2k4r8\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.198315 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-catalog-content\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.198337 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-utilities\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.198872 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-catalog-content\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.198955 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-utilities\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.217643 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2k4r8\" (UniqueName: \"kubernetes.io/projected/e959e97f-cd28-4a0d-9b23-4f355b755530-kube-api-access-2k4r8\") pod \"redhat-operators-ssr7v\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.264420 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:33 crc kubenswrapper[4913]: I1001 13:35:33.708649 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ssr7v"] Oct 01 13:35:33 crc kubenswrapper[4913]: W1001 13:35:33.709172 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode959e97f_cd28_4a0d_9b23_4f355b755530.slice/crio-e89adb336aa60a6fe98a616e2bce58bd59521ce0befb3823ab9050c6a563f723 WatchSource:0}: Error finding container e89adb336aa60a6fe98a616e2bce58bd59521ce0befb3823ab9050c6a563f723: Status 404 returned error can't find the container with id e89adb336aa60a6fe98a616e2bce58bd59521ce0befb3823ab9050c6a563f723 Oct 01 13:35:34 crc kubenswrapper[4913]: I1001 13:35:34.152313 4913 generic.go:334] "Generic (PLEG): container finished" podID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerID="50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6" exitCode=0 Oct 01 13:35:34 crc kubenswrapper[4913]: I1001 13:35:34.152368 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssr7v" event={"ID":"e959e97f-cd28-4a0d-9b23-4f355b755530","Type":"ContainerDied","Data":"50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6"} Oct 01 13:35:34 crc kubenswrapper[4913]: I1001 13:35:34.152625 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssr7v" event={"ID":"e959e97f-cd28-4a0d-9b23-4f355b755530","Type":"ContainerStarted","Data":"e89adb336aa60a6fe98a616e2bce58bd59521ce0befb3823ab9050c6a563f723"} Oct 01 13:35:34 crc kubenswrapper[4913]: I1001 13:35:34.154749 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:35:36 crc kubenswrapper[4913]: I1001 13:35:36.171166 4913 generic.go:334] "Generic (PLEG): container finished" podID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerID="e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c" exitCode=0 Oct 01 13:35:36 crc kubenswrapper[4913]: I1001 13:35:36.171231 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssr7v" event={"ID":"e959e97f-cd28-4a0d-9b23-4f355b755530","Type":"ContainerDied","Data":"e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c"} Oct 01 13:35:37 crc kubenswrapper[4913]: I1001 13:35:37.187057 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssr7v" event={"ID":"e959e97f-cd28-4a0d-9b23-4f355b755530","Type":"ContainerStarted","Data":"49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc"} Oct 01 13:35:37 crc kubenswrapper[4913]: I1001 13:35:37.209475 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ssr7v" podStartSLOduration=2.482785487 podStartE2EDuration="5.209455986s" podCreationTimestamp="2025-10-01 13:35:32 +0000 UTC" firstStartedPulling="2025-10-01 13:35:34.154473123 +0000 UTC m=+3466.057948691" lastFinishedPulling="2025-10-01 13:35:36.881143612 
+0000 UTC m=+3468.784619190" observedRunningTime="2025-10-01 13:35:37.202315054 +0000 UTC m=+3469.105790642" watchObservedRunningTime="2025-10-01 13:35:37.209455986 +0000 UTC m=+3469.112931564" Oct 01 13:35:43 crc kubenswrapper[4913]: I1001 13:35:43.265167 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:43 crc kubenswrapper[4913]: I1001 13:35:43.265795 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:43 crc kubenswrapper[4913]: I1001 13:35:43.317570 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:44 crc kubenswrapper[4913]: I1001 13:35:44.291808 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:44 crc kubenswrapper[4913]: I1001 13:35:44.339925 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ssr7v"] Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.263749 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ssr7v" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="registry-server" containerID="cri-o://49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc" gracePeriod=2 Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.726415 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.776801 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k4r8\" (UniqueName: \"kubernetes.io/projected/e959e97f-cd28-4a0d-9b23-4f355b755530-kube-api-access-2k4r8\") pod \"e959e97f-cd28-4a0d-9b23-4f355b755530\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.776917 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-catalog-content\") pod \"e959e97f-cd28-4a0d-9b23-4f355b755530\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.777002 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-utilities\") pod \"e959e97f-cd28-4a0d-9b23-4f355b755530\" (UID: \"e959e97f-cd28-4a0d-9b23-4f355b755530\") " Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.780319 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-utilities" (OuterVolumeSpecName: "utilities") pod "e959e97f-cd28-4a0d-9b23-4f355b755530" (UID: "e959e97f-cd28-4a0d-9b23-4f355b755530"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.784238 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e959e97f-cd28-4a0d-9b23-4f355b755530-kube-api-access-2k4r8" (OuterVolumeSpecName: "kube-api-access-2k4r8") pod "e959e97f-cd28-4a0d-9b23-4f355b755530" (UID: "e959e97f-cd28-4a0d-9b23-4f355b755530"). 
InnerVolumeSpecName "kube-api-access-2k4r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.879285 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k4r8\" (UniqueName: \"kubernetes.io/projected/e959e97f-cd28-4a0d-9b23-4f355b755530-kube-api-access-2k4r8\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.879318 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.900245 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e959e97f-cd28-4a0d-9b23-4f355b755530" (UID: "e959e97f-cd28-4a0d-9b23-4f355b755530"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:35:46 crc kubenswrapper[4913]: I1001 13:35:46.980819 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e959e97f-cd28-4a0d-9b23-4f355b755530-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.281190 4913 generic.go:334] "Generic (PLEG): container finished" podID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerID="49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc" exitCode=0 Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.281502 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ssr7v" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.281502 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssr7v" event={"ID":"e959e97f-cd28-4a0d-9b23-4f355b755530","Type":"ContainerDied","Data":"49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc"} Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.281634 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssr7v" event={"ID":"e959e97f-cd28-4a0d-9b23-4f355b755530","Type":"ContainerDied","Data":"e89adb336aa60a6fe98a616e2bce58bd59521ce0befb3823ab9050c6a563f723"} Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.281701 4913 scope.go:117] "RemoveContainer" containerID="49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.306289 4913 scope.go:117] "RemoveContainer" containerID="e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.328656 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ssr7v"] Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.335411 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ssr7v"] Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.354878 4913 scope.go:117] "RemoveContainer" containerID="50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.404870 4913 scope.go:117] "RemoveContainer" containerID="49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc" Oct 01 13:35:47 crc kubenswrapper[4913]: E1001 
13:35:47.405375 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc\": container with ID starting with 49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc not found: ID does not exist" containerID="49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.405437 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc"} err="failed to get container status \"49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc\": rpc error: code = NotFound desc = could not find container \"49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc\": container with ID starting with 49dd31f55331f78f6bc8e40054e5fea2d6321a946fbb10fddee7b28e40b5b0cc not found: ID does not exist" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.405481 4913 scope.go:117] "RemoveContainer" containerID="e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c" Oct 01 13:35:47 crc kubenswrapper[4913]: E1001 13:35:47.405710 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c\": container with ID starting with e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c not found: ID does not exist" containerID="e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.405793 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c"} err="failed to get container status \"e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c\": rpc error: code = NotFound desc = could not find container \"e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c\": container with ID starting with e64b8192eccd656521ef225855bc6bbb5a1b2a9aa34bcb7562aadad2dbe4006c not found: ID does not exist" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.405811 4913 scope.go:117] "RemoveContainer" containerID="50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6" Oct 01 13:35:47 crc kubenswrapper[4913]: E1001 13:35:47.406223 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6\": container with ID starting with 50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6 not found: ID does not exist" containerID="50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6" Oct 01 13:35:47 crc kubenswrapper[4913]: I1001 13:35:47.406286 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6"} err="failed to get container status \"50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6\": rpc error: code = NotFound desc = could not find container \"50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6\": container with ID starting with 50e54e8a51ed80e0cf6064d1583dfc2e83c78b5d90d08baf6c8bec4b26edcbc6 not found: ID does not exist" Oct 01 13:35:48 crc kubenswrapper[4913]: I1001 13:35:48.822386 4913 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" path="/var/lib/kubelet/pods/e959e97f-cd28-4a0d-9b23-4f355b755530/volumes" Oct 01 13:36:10 crc kubenswrapper[4913]: I1001 13:36:10.084217 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:36:10 crc kubenswrapper[4913]: I1001 13:36:10.085027 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:36:40 crc kubenswrapper[4913]: I1001 13:36:40.083105 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:36:40 crc kubenswrapper[4913]: I1001 13:36:40.083703 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.611598 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Oct 01 13:37:09 crc kubenswrapper[4913]: E1001 13:37:09.612872 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="registry-server" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.612899 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="registry-server" Oct 01 13:37:09 crc kubenswrapper[4913]: E1001 13:37:09.612937 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="extract-utilities" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.612956 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="extract-utilities" Oct 01 13:37:09 crc kubenswrapper[4913]: E1001 13:37:09.612982 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="extract-content" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.612992 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="extract-content" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.613309 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="e959e97f-cd28-4a0d-9b23-4f355b755530" containerName="registry-server" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.614506 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.618674 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qb4x7" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.619080 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.619436 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.619894 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.629921 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.634627 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.634755 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.635099 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.752636 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.754024 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.754247 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.754444 4913 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.756424 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.755726 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.758932 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.759020 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.758929 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.759153 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44wb\" (UniqueName: \"kubernetes.io/projected/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-kube-api-access-p44wb\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.759534 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.759560 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc 
kubenswrapper[4913]: I1001 13:37:09.775719 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861620 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861659 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861684 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861752 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861768 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861809 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.861880 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wb\" (UniqueName: \"kubernetes.io/projected/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-kube-api-access-p44wb\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.862624 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " 
pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.863067 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.863132 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.867020 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.867591 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.875025 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.877508 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44wb\" (UniqueName: \"kubernetes.io/projected/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-kube-api-access-p44wb\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.915523 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:09 crc kubenswrapper[4913]: I1001 13:37:09.940135 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Oct 01 13:37:10 crc kubenswrapper[4913]: I1001 13:37:10.093410 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:37:10 crc kubenswrapper[4913]: I1001 13:37:10.093453 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:37:10 crc kubenswrapper[4913]: I1001 13:37:10.093490 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:37:10 crc kubenswrapper[4913]: I1001 13:37:10.093881 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:37:10 crc kubenswrapper[4913]: I1001 13:37:10.093929 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" gracePeriod=600 Oct 01 13:37:10 crc kubenswrapper[4913]: E1001 13:37:10.228470 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:37:10 crc kubenswrapper[4913]: I1001 13:37:10.578975 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Oct 01 13:37:11 crc kubenswrapper[4913]: I1001 13:37:11.111101 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" exitCode=0 Oct 01 13:37:11 crc kubenswrapper[4913]: I1001 13:37:11.111764 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc"} Oct 01 13:37:11 crc kubenswrapper[4913]: I1001 13:37:11.111899 4913 scope.go:117] "RemoveContainer" containerID="c6822e12a17aae1690bae98ddc4a0974be6566860672dde11dc55972de0232d9" Oct 01 13:37:11 crc kubenswrapper[4913]: I1001 13:37:11.113871 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:37:11 crc kubenswrapper[4913]: E1001 13:37:11.116220 4913 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:37:11 crc kubenswrapper[4913]: I1001 13:37:11.135580 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"3ef07e68-9334-463a-888f-1fd9fe3d3f1c","Type":"ContainerStarted","Data":"ca5bdeb82302ffd823ed9c23f3e647110192b97265fa7546d7e515d8b69e86b6"} Oct 01 13:37:23 crc kubenswrapper[4913]: I1001 13:37:23.807526 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:37:23 crc kubenswrapper[4913]: E1001 13:37:23.808291 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:37:36 crc kubenswrapper[4913]: I1001 13:37:36.809994 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:37:36 crc kubenswrapper[4913]: E1001 13:37:36.810783 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:37:47 crc kubenswrapper[4913]: E1001 13:37:47.027529 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 01 13:37:47 crc kubenswrapper[4913]: E1001 13:37:47.028363 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p44wb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-full_openstack(3ef07e68-9334-463a-888f-1fd9fe3d3f1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:37:47 crc kubenswrapper[4913]: E1001 13:37:47.030171 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="3ef07e68-9334-463a-888f-1fd9fe3d3f1c" Oct 01 13:37:47 crc kubenswrapper[4913]: E1001 13:37:47.490543 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="3ef07e68-9334-463a-888f-1fd9fe3d3f1c" Oct 01 13:37:51 crc kubenswrapper[4913]: I1001 13:37:51.807518 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:37:51 crc kubenswrapper[4913]: E1001 13:37:51.808629 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:37:59 crc kubenswrapper[4913]: I1001 13:37:59.298327 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 13:38:00 crc kubenswrapper[4913]: I1001 13:38:00.612486 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"3ef07e68-9334-463a-888f-1fd9fe3d3f1c","Type":"ContainerStarted","Data":"79cbc0d54a3cf7042da99ffd6f6bffa0f768618707a168586c28b9d10ef20a92"} Oct 01 13:38:06 crc kubenswrapper[4913]: I1001 13:38:06.807171 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:38:06 crc kubenswrapper[4913]: E1001 13:38:06.808001 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:38:07 crc kubenswrapper[4913]: I1001 13:38:07.740240 4913 scope.go:117] "RemoveContainer" containerID="e8050bee28a591da5b0e35e5768ed6965cb04b322f14d7f47c34af9770801e1d" Oct 01 13:38:07 crc kubenswrapper[4913]: I1001 13:38:07.777073 4913 scope.go:117] "RemoveContainer" containerID="1bb4793c57b992463244b984d875b2a9f9e29ca45e32eda99dd8a70db1bf809c" Oct 01 13:38:21 crc kubenswrapper[4913]: I1001 13:38:21.807617 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:38:21 crc kubenswrapper[4913]: E1001 13:38:21.809866 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:38:33 crc kubenswrapper[4913]: I1001 13:38:33.807287 4913 scope.go:117] "RemoveContainer" 
containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:38:33 crc kubenswrapper[4913]: E1001 13:38:33.808115 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:38:48 crc kubenswrapper[4913]: I1001 13:38:48.815598 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:38:48 crc kubenswrapper[4913]: E1001 13:38:48.816378 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:39:03 crc kubenswrapper[4913]: I1001 13:39:03.807739 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:39:03 crc kubenswrapper[4913]: E1001 13:39:03.809321 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:39:14 crc kubenswrapper[4913]: I1001 13:39:14.807147 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:39:14 crc kubenswrapper[4913]: E1001 13:39:14.808830 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:39:28 crc kubenswrapper[4913]: I1001 13:39:28.814604 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:39:28 crc kubenswrapper[4913]: E1001 13:39:28.815439 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:39:40 crc kubenswrapper[4913]: I1001 13:39:40.811692 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:39:40 crc kubenswrapper[4913]: E1001 13:39:40.812337 4913 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:39:55 crc kubenswrapper[4913]: I1001 13:39:55.808637 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:39:55 crc kubenswrapper[4913]: E1001 13:39:55.809978 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:40:06 crc kubenswrapper[4913]: I1001 13:40:06.807548 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:40:06 crc kubenswrapper[4913]: E1001 13:40:06.808356 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:40:15 crc kubenswrapper[4913]: I1001 13:40:15.032569 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-full" podStartSLOduration=138.321921248 podStartE2EDuration="3m7.03254911s" podCreationTimestamp="2025-10-01 13:37:08 +0000 UTC" firstStartedPulling="2025-10-01 13:37:10.58397326 +0000 UTC m=+3562.487448838" lastFinishedPulling="2025-10-01 13:37:59.294601102 +0000 UTC m=+3611.198076700" observedRunningTime="2025-10-01 13:38:00.634734915 +0000 UTC m=+3612.538210513" watchObservedRunningTime="2025-10-01 13:40:15.03254911 +0000 UTC m=+3746.936024678" Oct 01 13:40:15 crc kubenswrapper[4913]: I1001 13:40:15.039069 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-tbv5k"] Oct 01 13:40:15 crc kubenswrapper[4913]: I1001 13:40:15.047152 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-tbv5k"] Oct 01 13:40:16 crc kubenswrapper[4913]: I1001 13:40:16.820036 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9154b01b-b6c6-450a-a022-2c0c7f6ccf9b" path="/var/lib/kubelet/pods/9154b01b-b6c6-450a-a022-2c0c7f6ccf9b/volumes" Oct 01 13:40:17 crc kubenswrapper[4913]: I1001 13:40:17.806824 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:40:17 crc kubenswrapper[4913]: E1001 13:40:17.807457 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" 
podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:40:21 crc kubenswrapper[4913]: I1001 13:40:21.023380 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 13:40:25 crc kubenswrapper[4913]: I1001 13:40:25.029568 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-1f2e-account-create-t5r92"] Oct 01 13:40:25 crc kubenswrapper[4913]: I1001 13:40:25.038536 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-1f2e-account-create-t5r92"] Oct 01 13:40:26 crc kubenswrapper[4913]: I1001 13:40:26.819943 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b53680-fe1a-44b3-8b5b-e721be53113c" path="/var/lib/kubelet/pods/72b53680-fe1a-44b3-8b5b-e721be53113c/volumes" Oct 01 13:40:32 crc kubenswrapper[4913]: I1001 13:40:32.807363 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:40:32 crc kubenswrapper[4913]: E1001 13:40:32.808572 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:40:44 crc kubenswrapper[4913]: I1001 13:40:44.807215 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:40:44 crc kubenswrapper[4913]: E1001 13:40:44.808041 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:40:56 crc kubenswrapper[4913]: I1001 13:40:56.806882 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:40:56 crc kubenswrapper[4913]: E1001 13:40:56.807855 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:41:07 crc kubenswrapper[4913]: I1001 13:41:07.876478 4913 scope.go:117] "RemoveContainer" containerID="a01bca6bfdd7d77fcf5688e9f5128d9e7256252038e4637282436651c4d738ef" Oct 01 13:41:07 crc kubenswrapper[4913]: I1001 13:41:07.903139 4913 scope.go:117] "RemoveContainer" containerID="a6a4ca061bc171ba308cf95a966fabaa5ec51392e93749ab08d45083c25ef18e" Oct 01 13:41:10 crc kubenswrapper[4913]: I1001 13:41:10.806901 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:41:10 crc kubenswrapper[4913]: E1001 13:41:10.807789 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:41:23 crc kubenswrapper[4913]: I1001 13:41:23.808014 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:41:23 crc kubenswrapper[4913]: E1001 13:41:23.808875 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:41:35 crc kubenswrapper[4913]: I1001 13:41:35.807753 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:41:35 crc kubenswrapper[4913]: E1001 13:41:35.808607 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:41:50 crc kubenswrapper[4913]: I1001 13:41:50.806690 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:41:50 crc kubenswrapper[4913]: E1001 13:41:50.807935 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:42:00 crc kubenswrapper[4913]: I1001 13:42:00.054508 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-grb9h"] Oct 01 13:42:00 crc kubenswrapper[4913]: I1001 13:42:00.063476 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-grb9h"] Oct 01 13:42:00 crc kubenswrapper[4913]: I1001 13:42:00.818631 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9" path="/var/lib/kubelet/pods/4c2caa36-d4e7-423b-ae1a-25bd17c0f2c9/volumes" Oct 01 13:42:04 crc kubenswrapper[4913]: I1001 13:42:04.809261 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:42:04 crc kubenswrapper[4913]: E1001 13:42:04.810114 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" 
Oct 01 13:42:08 crc kubenswrapper[4913]: I1001 13:42:08.022291 4913 scope.go:117] "RemoveContainer" containerID="f3f8286a7f2f5f6971d4572620219a09aaa86eb4087cbd6fe7d687cf0ae32f90" Oct 01 13:42:18 crc kubenswrapper[4913]: I1001 13:42:18.812068 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:42:19 crc kubenswrapper[4913]: I1001 13:42:19.952161 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"8c6eb24b296590bfeed8f1a7bd7962b6adafb08ee1cc43e33f4e41339121152e"} Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.233047 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p8qf4"] Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.237805 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.253610 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8qf4"] Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.296040 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-catalog-content\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.297016 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-utilities\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.297166 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2pxj\" (UniqueName: \"kubernetes.io/projected/ac570e14-f49e-4971-90cd-b7520aa36f9a-kube-api-access-q2pxj\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.399193 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-catalog-content\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.399256 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-utilities\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.399532 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2pxj\" (UniqueName: \"kubernetes.io/projected/ac570e14-f49e-4971-90cd-b7520aa36f9a-kube-api-access-q2pxj\") pod \"redhat-marketplace-p8qf4\" 
(UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.399857 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-catalog-content\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.399922 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-utilities\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.430219 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2pxj\" (UniqueName: \"kubernetes.io/projected/ac570e14-f49e-4971-90cd-b7520aa36f9a-kube-api-access-q2pxj\") pod \"redhat-marketplace-p8qf4\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:50 crc kubenswrapper[4913]: I1001 13:42:50.564145 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:42:51 crc kubenswrapper[4913]: I1001 13:42:51.062824 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8qf4"] Oct 01 13:42:51 crc kubenswrapper[4913]: I1001 13:42:51.248191 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8qf4" event={"ID":"ac570e14-f49e-4971-90cd-b7520aa36f9a","Type":"ContainerStarted","Data":"d30159fba337a4d62b4e17f907463b404333c768ac332544465bc444897a7c11"} Oct 01 13:42:52 crc kubenswrapper[4913]: I1001 13:42:52.268259 4913 generic.go:334] "Generic (PLEG): container finished" podID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerID="c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af" exitCode=0 Oct 01 13:42:52 crc kubenswrapper[4913]: I1001 13:42:52.268627 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8qf4" event={"ID":"ac570e14-f49e-4971-90cd-b7520aa36f9a","Type":"ContainerDied","Data":"c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af"} Oct 01 13:42:52 crc kubenswrapper[4913]: I1001 13:42:52.271452 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:42:54 crc kubenswrapper[4913]: I1001 13:42:54.286641 4913 generic.go:334] "Generic (PLEG): container finished" podID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerID="3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f" exitCode=0 Oct 01 13:42:54 crc kubenswrapper[4913]: I1001 13:42:54.286720 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8qf4" event={"ID":"ac570e14-f49e-4971-90cd-b7520aa36f9a","Type":"ContainerDied","Data":"3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f"} Oct 01 13:42:55 crc kubenswrapper[4913]: I1001 13:42:55.310262 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8qf4" 
event={"ID":"ac570e14-f49e-4971-90cd-b7520aa36f9a","Type":"ContainerStarted","Data":"0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a"} Oct 01 13:42:55 crc kubenswrapper[4913]: I1001 13:42:55.354898 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p8qf4" podStartSLOduration=2.902662434 podStartE2EDuration="5.354875483s" podCreationTimestamp="2025-10-01 13:42:50 +0000 UTC" firstStartedPulling="2025-10-01 13:42:52.271119278 +0000 UTC m=+3904.174594856" lastFinishedPulling="2025-10-01 13:42:54.723332327 +0000 UTC m=+3906.626807905" observedRunningTime="2025-10-01 13:42:55.342746026 +0000 UTC m=+3907.246221624" watchObservedRunningTime="2025-10-01 13:42:55.354875483 +0000 UTC m=+3907.258351061" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.686259 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tm6sp"] Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.691385 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.703757 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tm6sp"] Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.765319 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-catalog-content\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.765414 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-utilities\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.765892 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lhz\" (UniqueName: \"kubernetes.io/projected/bcb3b058-e046-4788-bd33-966befff48d4-kube-api-access-48lhz\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.868425 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-catalog-content\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.868480 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-utilities\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.868606 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lhz\" (UniqueName: 
\"kubernetes.io/projected/bcb3b058-e046-4788-bd33-966befff48d4-kube-api-access-48lhz\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.869198 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-utilities\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.869432 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-catalog-content\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:57 crc kubenswrapper[4913]: I1001 13:42:57.889616 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lhz\" (UniqueName: \"kubernetes.io/projected/bcb3b058-e046-4788-bd33-966befff48d4-kube-api-access-48lhz\") pod \"community-operators-tm6sp\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:58 crc kubenswrapper[4913]: I1001 13:42:58.024277 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:42:58 crc kubenswrapper[4913]: I1001 13:42:58.552323 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tm6sp"] Oct 01 13:42:59 crc kubenswrapper[4913]: I1001 13:42:59.351407 4913 generic.go:334] "Generic (PLEG): container finished" podID="bcb3b058-e046-4788-bd33-966befff48d4" containerID="dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e" exitCode=0 Oct 01 13:42:59 crc kubenswrapper[4913]: I1001 13:42:59.351549 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerDied","Data":"dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e"} Oct 01 13:42:59 crc kubenswrapper[4913]: I1001 13:42:59.351878 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerStarted","Data":"f429e4962dc9a20717a63b4455e6b0062da9cf8055ce92ccbdffd6a15b58d26f"} Oct 01 13:43:00 crc kubenswrapper[4913]: I1001 13:43:00.362261 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerStarted","Data":"43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638"} Oct 01 13:43:00 crc kubenswrapper[4913]: I1001 13:43:00.565277 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:43:00 crc kubenswrapper[4913]: I1001 13:43:00.566414 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:43:00 crc kubenswrapper[4913]: I1001 13:43:00.617732 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 
01 13:43:01 crc kubenswrapper[4913]: I1001 13:43:01.372719 4913 generic.go:334] "Generic (PLEG): container finished" podID="bcb3b058-e046-4788-bd33-966befff48d4" containerID="43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638" exitCode=0 Oct 01 13:43:01 crc kubenswrapper[4913]: I1001 13:43:01.372777 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerDied","Data":"43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638"} Oct 01 13:43:01 crc kubenswrapper[4913]: I1001 13:43:01.443007 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:43:02 crc kubenswrapper[4913]: I1001 13:43:02.382932 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerStarted","Data":"e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774"} Oct 01 13:43:02 crc kubenswrapper[4913]: I1001 13:43:02.402856 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tm6sp" podStartSLOduration=2.678676298 podStartE2EDuration="5.402829628s" podCreationTimestamp="2025-10-01 13:42:57 +0000 UTC" firstStartedPulling="2025-10-01 13:42:59.354007313 +0000 UTC m=+3911.257482891" lastFinishedPulling="2025-10-01 13:43:02.078160643 +0000 UTC m=+3913.981636221" observedRunningTime="2025-10-01 13:43:02.398722973 +0000 UTC m=+3914.302198551" watchObservedRunningTime="2025-10-01 13:43:02.402829628 +0000 UTC m=+3914.306305206" Oct 01 13:43:02 crc kubenswrapper[4913]: I1001 13:43:02.864114 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8qf4"] Oct 01 13:43:04 crc kubenswrapper[4913]: I1001 13:43:04.425971 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p8qf4" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="registry-server" containerID="cri-o://0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a" gracePeriod=2 Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.034585 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.110653 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-utilities\") pod \"ac570e14-f49e-4971-90cd-b7520aa36f9a\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.110778 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-catalog-content\") pod \"ac570e14-f49e-4971-90cd-b7520aa36f9a\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.110797 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2pxj\" (UniqueName: \"kubernetes.io/projected/ac570e14-f49e-4971-90cd-b7520aa36f9a-kube-api-access-q2pxj\") pod \"ac570e14-f49e-4971-90cd-b7520aa36f9a\" (UID: \"ac570e14-f49e-4971-90cd-b7520aa36f9a\") " Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.119468 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac570e14-f49e-4971-90cd-b7520aa36f9a-kube-api-access-q2pxj" (OuterVolumeSpecName: "kube-api-access-q2pxj") pod "ac570e14-f49e-4971-90cd-b7520aa36f9a" (UID: "ac570e14-f49e-4971-90cd-b7520aa36f9a"). InnerVolumeSpecName "kube-api-access-q2pxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.120307 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-utilities" (OuterVolumeSpecName: "utilities") pod "ac570e14-f49e-4971-90cd-b7520aa36f9a" (UID: "ac570e14-f49e-4971-90cd-b7520aa36f9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.125124 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac570e14-f49e-4971-90cd-b7520aa36f9a" (UID: "ac570e14-f49e-4971-90cd-b7520aa36f9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.212983 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.213028 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac570e14-f49e-4971-90cd-b7520aa36f9a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.213042 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2pxj\" (UniqueName: \"kubernetes.io/projected/ac570e14-f49e-4971-90cd-b7520aa36f9a-kube-api-access-q2pxj\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.436411 4913 generic.go:334] "Generic (PLEG): container finished" podID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerID="0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a" exitCode=0 Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.436487 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8qf4" event={"ID":"ac570e14-f49e-4971-90cd-b7520aa36f9a","Type":"ContainerDied","Data":"0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a"} Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.436599 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8qf4" event={"ID":"ac570e14-f49e-4971-90cd-b7520aa36f9a","Type":"ContainerDied","Data":"d30159fba337a4d62b4e17f907463b404333c768ac332544465bc444897a7c11"} Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.436633 4913 scope.go:117] "RemoveContainer" containerID="0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.437787 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8qf4" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.473060 4913 scope.go:117] "RemoveContainer" containerID="3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.475329 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8qf4"] Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.485978 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8qf4"] Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.507536 4913 scope.go:117] "RemoveContainer" containerID="c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.552171 4913 scope.go:117] "RemoveContainer" containerID="0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a" Oct 01 13:43:05 crc kubenswrapper[4913]: E1001 13:43:05.553624 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a\": container with ID starting with 0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a not found: ID does not exist" containerID="0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.553687 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a"} err="failed to get container status \"0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a\": rpc error: code = NotFound desc = could not find container \"0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a\": container with ID starting with 0e24a12ec964c399deaf353a83fbf7abd59b559a7dc7c56ab270ed7c9fa32c7a not found: ID does not exist" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.553728 4913 scope.go:117] "RemoveContainer" containerID="3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f" Oct 01 13:43:05 crc kubenswrapper[4913]: E1001 13:43:05.554134 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f\": container with ID starting with 3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f not found: ID does not exist" containerID="3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.554234 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f"} err="failed to get container status \"3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f\": rpc error: code = NotFound desc = could not find container \"3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f\": container with ID starting with 3f16a6c612b56bef3dbdb5ae59a4ac2add77b42123b94a63b2a319a98de1505f not found: ID does not exist" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.554336 4913 scope.go:117] "RemoveContainer" containerID="c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af" Oct 01 13:43:05 crc kubenswrapper[4913]: E1001 13:43:05.554726 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af\": container with ID starting with c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af not found: ID does not exist" containerID="c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af" Oct 01 13:43:05 crc kubenswrapper[4913]: I1001 13:43:05.554839 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af"} err="failed to get container status \"c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af\": rpc error: code = NotFound desc = could not find container \"c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af\": container with ID starting with c559408abe4904c2472b1ef52cf273c86e4aec106423b61163a655a32625b3af not found: ID does not exist" Oct 01 13:43:06 crc kubenswrapper[4913]: I1001 13:43:06.817394 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" path="/var/lib/kubelet/pods/ac570e14-f49e-4971-90cd-b7520aa36f9a/volumes" Oct 01 13:43:08 crc kubenswrapper[4913]: I1001 13:43:08.024640 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:43:08 crc kubenswrapper[4913]: I1001 13:43:08.024891 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:43:08 crc kubenswrapper[4913]: I1001 13:43:08.079974 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:43:08 crc kubenswrapper[4913]: I1001 13:43:08.554797 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:43:09 crc kubenswrapper[4913]: I1001 13:43:09.463127 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tm6sp"] Oct 01 13:43:10 crc kubenswrapper[4913]: I1001 13:43:10.478095 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tm6sp" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="registry-server" containerID="cri-o://e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774" gracePeriod=2 Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.128780 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.257761 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-catalog-content\") pod \"bcb3b058-e046-4788-bd33-966befff48d4\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.258074 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-utilities\") pod \"bcb3b058-e046-4788-bd33-966befff48d4\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.258103 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lhz\" (UniqueName: \"kubernetes.io/projected/bcb3b058-e046-4788-bd33-966befff48d4-kube-api-access-48lhz\") pod \"bcb3b058-e046-4788-bd33-966befff48d4\" (UID: \"bcb3b058-e046-4788-bd33-966befff48d4\") " Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.258957 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-utilities" (OuterVolumeSpecName: "utilities") pod "bcb3b058-e046-4788-bd33-966befff48d4" (UID: "bcb3b058-e046-4788-bd33-966befff48d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.259566 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.263593 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb3b058-e046-4788-bd33-966befff48d4-kube-api-access-48lhz" (OuterVolumeSpecName: "kube-api-access-48lhz") pod "bcb3b058-e046-4788-bd33-966befff48d4" (UID: "bcb3b058-e046-4788-bd33-966befff48d4"). InnerVolumeSpecName "kube-api-access-48lhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.312202 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcb3b058-e046-4788-bd33-966befff48d4" (UID: "bcb3b058-e046-4788-bd33-966befff48d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.361112 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb3b058-e046-4788-bd33-966befff48d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.361147 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lhz\" (UniqueName: \"kubernetes.io/projected/bcb3b058-e046-4788-bd33-966befff48d4-kube-api-access-48lhz\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.490329 4913 generic.go:334] "Generic (PLEG): container finished" podID="bcb3b058-e046-4788-bd33-966befff48d4" containerID="e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774" exitCode=0 Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.490373 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerDied","Data":"e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774"} Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.490400 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tm6sp" event={"ID":"bcb3b058-e046-4788-bd33-966befff48d4","Type":"ContainerDied","Data":"f429e4962dc9a20717a63b4455e6b0062da9cf8055ce92ccbdffd6a15b58d26f"} Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.490416 4913 scope.go:117] "RemoveContainer" containerID="e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.490435 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tm6sp" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.526258 4913 scope.go:117] "RemoveContainer" containerID="43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.531247 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tm6sp"] Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.539176 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tm6sp"] Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.545016 4913 scope.go:117] "RemoveContainer" containerID="dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.596604 4913 scope.go:117] "RemoveContainer" containerID="e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774" Oct 01 13:43:11 crc kubenswrapper[4913]: E1001 13:43:11.597173 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774\": container with ID starting with e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774 not found: ID does not exist" containerID="e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.597238 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774"} err="failed to get container status \"e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774\": rpc error: code = NotFound desc = could not find container \"e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774\": container with ID starting with e85bba9b80f45ac40eec5d9e871efd9086bc781cbead6a315a90ee344d21d774 not found: ID does not exist" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.597287 4913 scope.go:117] "RemoveContainer" containerID="43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638" Oct 01 13:43:11 crc kubenswrapper[4913]: E1001 13:43:11.597815 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638\": container with ID starting with 43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638 not found: ID does not exist" containerID="43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.597848 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638"} err="failed to get container status \"43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638\": rpc error: code = NotFound desc = could not find container \"43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638\": container with ID starting with 43127f91b555ab6cc7f836751171b3dac6a070cdc5b30a3147f4fa0fe6e0f638 not found: ID does not exist" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.597867 4913 scope.go:117] "RemoveContainer" containerID="dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e" Oct 01 13:43:11 crc kubenswrapper[4913]: E1001 13:43:11.598191 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e\": container with ID starting with dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e not found: ID does not exist" containerID="dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e" Oct 01 13:43:11 crc kubenswrapper[4913]: I1001 13:43:11.598287 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e"} err="failed to get container status \"dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e\": rpc error: code = NotFound desc = could not find container \"dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e\": container with ID starting with dad0716130cf50a165a1d8d3556054414f81a2ece8848126319314d0deeb6a2e not found: ID does not exist" Oct 01 13:43:12 crc kubenswrapper[4913]: I1001 13:43:12.817401 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb3b058-e046-4788-bd33-966befff48d4" path="/var/lib/kubelet/pods/bcb3b058-e046-4788-bd33-966befff48d4/volumes" Oct 01 13:44:40 crc kubenswrapper[4913]: I1001 13:44:40.083810 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:44:40 crc kubenswrapper[4913]: I1001 13:44:40.084365 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.150469 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn"] Oct 01 13:45:00 crc kubenswrapper[4913]: E1001 13:45:00.151516 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151533 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4913]: E1001 13:45:00.151546 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151552 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4913]: E1001 13:45:00.151566 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151572 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4913]: E1001 13:45:00.151578 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="extract-content" Oct 01 13:45:00 crc 
kubenswrapper[4913]: I1001 13:45:00.151585 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4913]: E1001 13:45:00.151603 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151610 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4913]: E1001 13:45:00.151618 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151624 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151819 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb3b058-e046-4788-bd33-966befff48d4" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.151848 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac570e14-f49e-4971-90cd-b7520aa36f9a" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.153456 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.162536 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.162778 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.167638 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn"] Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.278577 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2d30dfe-c157-4393-8a07-60ba3fc50e49-secret-volume\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.278660 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhhq\" (UniqueName: \"kubernetes.io/projected/c2d30dfe-c157-4393-8a07-60ba3fc50e49-kube-api-access-8fhhq\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.279533 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2d30dfe-c157-4393-8a07-60ba3fc50e49-config-volume\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc 
kubenswrapper[4913]: I1001 13:45:00.381525 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2d30dfe-c157-4393-8a07-60ba3fc50e49-secret-volume\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.381597 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhhq\" (UniqueName: \"kubernetes.io/projected/c2d30dfe-c157-4393-8a07-60ba3fc50e49-kube-api-access-8fhhq\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.381628 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2d30dfe-c157-4393-8a07-60ba3fc50e49-config-volume\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.382702 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2d30dfe-c157-4393-8a07-60ba3fc50e49-config-volume\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.387599 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2d30dfe-c157-4393-8a07-60ba3fc50e49-secret-volume\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.399101 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhhq\" (UniqueName: \"kubernetes.io/projected/c2d30dfe-c157-4393-8a07-60ba3fc50e49-kube-api-access-8fhhq\") pod \"collect-profiles-29322105-lhbtn\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.507065 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:00 crc kubenswrapper[4913]: I1001 13:45:00.990477 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn"] Oct 01 13:45:01 crc kubenswrapper[4913]: I1001 13:45:01.455062 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" event={"ID":"c2d30dfe-c157-4393-8a07-60ba3fc50e49","Type":"ContainerStarted","Data":"acd7a98cd77b56d363d3d820147fbd1eb6265cb68de9a2409b7d9bff21658c5d"} Oct 01 13:45:01 crc kubenswrapper[4913]: I1001 13:45:01.455408 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" event={"ID":"c2d30dfe-c157-4393-8a07-60ba3fc50e49","Type":"ContainerStarted","Data":"70a053567d9c86fd6e1655d6e1f2e206d0bad781799e6aba6b09d0f689a94392"} Oct 01 13:45:01 crc kubenswrapper[4913]: I1001 13:45:01.476047 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" podStartSLOduration=1.476029136 podStartE2EDuration="1.476029136s" podCreationTimestamp="2025-10-01 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:45:01.468134817 +0000 UTC m=+4033.371610405" watchObservedRunningTime="2025-10-01 13:45:01.476029136 +0000 UTC m=+4033.379504714" Oct 01 13:45:03 crc kubenswrapper[4913]: I1001 13:45:03.478103 4913 generic.go:334] "Generic (PLEG): container finished" podID="c2d30dfe-c157-4393-8a07-60ba3fc50e49" containerID="acd7a98cd77b56d363d3d820147fbd1eb6265cb68de9a2409b7d9bff21658c5d" exitCode=0 Oct 01 13:45:03 crc kubenswrapper[4913]: I1001 13:45:03.478177 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" event={"ID":"c2d30dfe-c157-4393-8a07-60ba3fc50e49","Type":"ContainerDied","Data":"acd7a98cd77b56d363d3d820147fbd1eb6265cb68de9a2409b7d9bff21658c5d"} Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.011133 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.181185 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2d30dfe-c157-4393-8a07-60ba3fc50e49-config-volume\") pod \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.181307 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fhhq\" (UniqueName: \"kubernetes.io/projected/c2d30dfe-c157-4393-8a07-60ba3fc50e49-kube-api-access-8fhhq\") pod \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.181381 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2d30dfe-c157-4393-8a07-60ba3fc50e49-secret-volume\") pod \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\" (UID: \"c2d30dfe-c157-4393-8a07-60ba3fc50e49\") " Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.182092 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d30dfe-c157-4393-8a07-60ba3fc50e49-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2d30dfe-c157-4393-8a07-60ba3fc50e49" (UID: "c2d30dfe-c157-4393-8a07-60ba3fc50e49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.188352 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d30dfe-c157-4393-8a07-60ba3fc50e49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2d30dfe-c157-4393-8a07-60ba3fc50e49" (UID: "c2d30dfe-c157-4393-8a07-60ba3fc50e49"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.200773 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d30dfe-c157-4393-8a07-60ba3fc50e49-kube-api-access-8fhhq" (OuterVolumeSpecName: "kube-api-access-8fhhq") pod "c2d30dfe-c157-4393-8a07-60ba3fc50e49" (UID: "c2d30dfe-c157-4393-8a07-60ba3fc50e49"). InnerVolumeSpecName "kube-api-access-8fhhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.283526 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2d30dfe-c157-4393-8a07-60ba3fc50e49-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.283565 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fhhq\" (UniqueName: \"kubernetes.io/projected/c2d30dfe-c157-4393-8a07-60ba3fc50e49-kube-api-access-8fhhq\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.283580 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2d30dfe-c157-4393-8a07-60ba3fc50e49-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.498212 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" event={"ID":"c2d30dfe-c157-4393-8a07-60ba3fc50e49","Type":"ContainerDied","Data":"70a053567d9c86fd6e1655d6e1f2e206d0bad781799e6aba6b09d0f689a94392"} Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.498265 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.498285 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70a053567d9c86fd6e1655d6e1f2e206d0bad781799e6aba6b09d0f689a94392" Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.577879 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw"] Oct 01 13:45:05 crc kubenswrapper[4913]: I1001 13:45:05.586232 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-8vzmw"] Oct 01 13:45:06 crc kubenswrapper[4913]: I1001 13:45:06.818348 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45859544-5cfd-44cb-8414-3c21d7256c2a" path="/var/lib/kubelet/pods/45859544-5cfd-44cb-8414-3c21d7256c2a/volumes" Oct 01 13:45:08 crc kubenswrapper[4913]: I1001 13:45:08.234225 4913 scope.go:117] "RemoveContainer" containerID="f7bb353c4fa40aeec972fd5d5327b3f88a4ac37c68cfe3a110296315779e7bb4" Oct 01 13:45:10 crc kubenswrapper[4913]: I1001 13:45:10.083969 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:45:10 crc kubenswrapper[4913]: I1001 13:45:10.084544 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.084094 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.084652 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.084709 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.085506 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c6eb24b296590bfeed8f1a7bd7962b6adafb08ee1cc43e33f4e41339121152e"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.085619 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://8c6eb24b296590bfeed8f1a7bd7962b6adafb08ee1cc43e33f4e41339121152e" gracePeriod=600 Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.797210 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="8c6eb24b296590bfeed8f1a7bd7962b6adafb08ee1cc43e33f4e41339121152e" exitCode=0 Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.797359 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"8c6eb24b296590bfeed8f1a7bd7962b6adafb08ee1cc43e33f4e41339121152e"} Oct 01 13:45:40 crc kubenswrapper[4913]: I1001 13:45:40.797637 4913 scope.go:117] "RemoveContainer" containerID="e1876f25a3b6187be9eb00ea84d789e764c028b31dfcc036709017732cfb8dfc" Oct 01 13:45:41 crc kubenswrapper[4913]: I1001 13:45:41.809100 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"} Oct 01 13:47:40 crc kubenswrapper[4913]: I1001 13:47:40.083867 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:47:40 crc kubenswrapper[4913]: I1001 13:47:40.084883 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:48:10 crc kubenswrapper[4913]: I1001 13:48:10.083825 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:48:10 crc kubenswrapper[4913]: I1001 13:48:10.084389 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.083562 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.084132 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.084180 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.084943 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.085012 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" gracePeriod=600 Oct 01 13:48:40 crc kubenswrapper[4913]: E1001 13:48:40.209769 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.355667 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" exitCode=0 Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.355744 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"} Oct 01 13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.355784 4913 scope.go:117] "RemoveContainer" containerID="8c6eb24b296590bfeed8f1a7bd7962b6adafb08ee1cc43e33f4e41339121152e" Oct 01 
13:48:40 crc kubenswrapper[4913]: I1001 13:48:40.356527 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:48:40 crc kubenswrapper[4913]: E1001 13:48:40.356816 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:48:53 crc kubenswrapper[4913]: I1001 13:48:53.806683 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:48:53 crc kubenswrapper[4913]: E1001 13:48:53.807698 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:49:05 crc kubenswrapper[4913]: I1001 13:49:05.806462 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:49:05 crc kubenswrapper[4913]: E1001 13:49:05.807367 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:49:16 crc kubenswrapper[4913]: I1001 13:49:16.808633 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:49:16 crc kubenswrapper[4913]: E1001 13:49:16.809582 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:49:31 crc kubenswrapper[4913]: I1001 13:49:31.807373 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:49:31 crc kubenswrapper[4913]: E1001 13:49:31.808993 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.134518 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5679q"] Oct 01 13:49:45 crc 
kubenswrapper[4913]: E1001 13:49:45.135859 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d30dfe-c157-4393-8a07-60ba3fc50e49" containerName="collect-profiles" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.135879 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d30dfe-c157-4393-8a07-60ba3fc50e49" containerName="collect-profiles" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.136167 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d30dfe-c157-4393-8a07-60ba3fc50e49" containerName="collect-profiles" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.138116 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.153062 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5679q"] Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.254965 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-utilities\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.255149 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5qn\" (UniqueName: \"kubernetes.io/projected/3e3569b7-6f25-43de-9543-7456b580ed0b-kube-api-access-tt5qn\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.255244 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-catalog-content\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.357164 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-catalog-content\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.357591 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-utilities\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.357796 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5qn\" (UniqueName: \"kubernetes.io/projected/3e3569b7-6f25-43de-9543-7456b580ed0b-kube-api-access-tt5qn\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.357834 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-catalog-content\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.358012 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-utilities\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.383646 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5qn\" (UniqueName: \"kubernetes.io/projected/3e3569b7-6f25-43de-9543-7456b580ed0b-kube-api-access-tt5qn\") pod \"certified-operators-5679q\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") " pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.459886 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5679q" Oct 01 13:49:45 crc kubenswrapper[4913]: I1001 13:49:45.806476 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:49:45 crc kubenswrapper[4913]: E1001 13:49:45.806938 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 13:49:46 crc kubenswrapper[4913]: I1001 13:49:46.063137 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5679q"] Oct 01 13:49:46 crc kubenswrapper[4913]: I1001 13:49:46.980680 4913 generic.go:334] "Generic (PLEG): container finished" podID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerID="8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7" exitCode=0 Oct 01 13:49:46 crc kubenswrapper[4913]: I1001 13:49:46.981012 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5679q" event={"ID":"3e3569b7-6f25-43de-9543-7456b580ed0b","Type":"ContainerDied","Data":"8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7"} Oct 01 13:49:46 crc kubenswrapper[4913]: I1001 13:49:46.981046 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5679q" event={"ID":"3e3569b7-6f25-43de-9543-7456b580ed0b","Type":"ContainerStarted","Data":"770ddcd559bcac472f937e55fc94d1cadcb4c865cdeb3c49ce3aa9b598f62abf"} Oct 01 13:49:46 crc kubenswrapper[4913]: I1001 13:49:46.982706 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.713501 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c5js9"] Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.715797 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.730214 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c5js9"] Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.804548 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-utilities\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.804936 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-catalog-content\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.805001 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghtq\" (UniqueName: \"kubernetes.io/projected/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-kube-api-access-9ghtq\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.907186 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-catalog-content\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.907237 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghtq\" (UniqueName: \"kubernetes.io/projected/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-kube-api-access-9ghtq\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.907342 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-utilities\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.907938 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-catalog-content\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.907970 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-utilities\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:47 crc kubenswrapper[4913]: I1001 13:49:47.927951 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9ghtq\" (UniqueName: \"kubernetes.io/projected/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-kube-api-access-9ghtq\") pod \"redhat-operators-c5js9\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") " pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:48 crc kubenswrapper[4913]: I1001 13:49:48.043239 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5js9" Oct 01 13:49:48 crc kubenswrapper[4913]: I1001 13:49:48.517672 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c5js9"] Oct 01 13:49:48 crc kubenswrapper[4913]: W1001 13:49:48.532398 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b54ddc_5a2c_4559_bc55_4ea24b8aeb12.slice/crio-d97a123eb01169560427260175e3d580353361c3bc3a0a43ea334bcfda6887af WatchSource:0}: Error finding container d97a123eb01169560427260175e3d580353361c3bc3a0a43ea334bcfda6887af: Status 404 returned error can't find the container with id d97a123eb01169560427260175e3d580353361c3bc3a0a43ea334bcfda6887af Oct 01 13:49:48 crc kubenswrapper[4913]: I1001 13:49:48.998220 4913 generic.go:334] "Generic (PLEG): container finished" podID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerID="762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de" exitCode=0 Oct 01 13:49:48 crc kubenswrapper[4913]: I1001 13:49:48.998402 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerDied","Data":"762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de"} Oct 01 13:49:48 crc kubenswrapper[4913]: I1001 13:49:48.998603 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerStarted","Data":"d97a123eb01169560427260175e3d580353361c3bc3a0a43ea334bcfda6887af"} Oct 01 13:49:49 crc kubenswrapper[4913]: I1001 13:49:49.000767 4913 generic.go:334] "Generic (PLEG): container finished" podID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerID="2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc" exitCode=0 Oct 01 13:49:49 crc kubenswrapper[4913]: I1001 13:49:49.000813 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5679q" event={"ID":"3e3569b7-6f25-43de-9543-7456b580ed0b","Type":"ContainerDied","Data":"2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc"} Oct 01 13:49:50 crc kubenswrapper[4913]: I1001 13:49:50.014173 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerStarted","Data":"ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c"} Oct 01 13:49:50 crc kubenswrapper[4913]: I1001 13:49:50.016328 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5679q" event={"ID":"3e3569b7-6f25-43de-9543-7456b580ed0b","Type":"ContainerStarted","Data":"50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807"} Oct 01 13:49:50 crc kubenswrapper[4913]: I1001 13:49:50.050727 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5679q" podStartSLOduration=2.643517512 podStartE2EDuration="5.050704609s" 
podCreationTimestamp="2025-10-01 13:49:45 +0000 UTC" firstStartedPulling="2025-10-01 13:49:46.982517848 +0000 UTC m=+4318.885993426" lastFinishedPulling="2025-10-01 13:49:49.389704945 +0000 UTC m=+4321.293180523" observedRunningTime="2025-10-01 13:49:50.045423062 +0000 UTC m=+4321.948898660" watchObservedRunningTime="2025-10-01 13:49:50.050704609 +0000 UTC m=+4321.954180197"
Oct 01 13:49:55 crc kubenswrapper[4913]: I1001 13:49:55.460283 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5679q"
Oct 01 13:49:55 crc kubenswrapper[4913]: I1001 13:49:55.460891 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5679q"
Oct 01 13:49:55 crc kubenswrapper[4913]: I1001 13:49:55.506226 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5679q"
Oct 01 13:49:56 crc kubenswrapper[4913]: I1001 13:49:56.067024 4913 generic.go:334] "Generic (PLEG): container finished" podID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerID="ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c" exitCode=0
Oct 01 13:49:56 crc kubenswrapper[4913]: I1001 13:49:56.067118 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerDied","Data":"ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c"}
Oct 01 13:49:56 crc kubenswrapper[4913]: I1001 13:49:56.121916 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5679q"
Oct 01 13:49:57 crc kubenswrapper[4913]: I1001 13:49:57.082187 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerStarted","Data":"501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a"}
Oct 01 13:49:57 crc kubenswrapper[4913]: I1001 13:49:57.104504 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5679q"]
Oct 01 13:49:57 crc kubenswrapper[4913]: I1001 13:49:57.105461 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c5js9" podStartSLOduration=2.6150061449999997 podStartE2EDuration="10.10545075s" podCreationTimestamp="2025-10-01 13:49:47 +0000 UTC" firstStartedPulling="2025-10-01 13:49:49.000640822 +0000 UTC m=+4320.904116400" lastFinishedPulling="2025-10-01 13:49:56.491085427 +0000 UTC m=+4328.394561005" observedRunningTime="2025-10-01 13:49:57.100207364 +0000 UTC m=+4329.003682952" watchObservedRunningTime="2025-10-01 13:49:57.10545075 +0000 UTC m=+4329.008926328"
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.044175 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c5js9"
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.044459 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c5js9"
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.096078 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5679q" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="registry-server" containerID="cri-o://50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807" gracePeriod=2
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.741430 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5679q"
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.927787 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt5qn\" (UniqueName: \"kubernetes.io/projected/3e3569b7-6f25-43de-9543-7456b580ed0b-kube-api-access-tt5qn\") pod \"3e3569b7-6f25-43de-9543-7456b580ed0b\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") "
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.927830 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-utilities\") pod \"3e3569b7-6f25-43de-9543-7456b580ed0b\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") "
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.928022 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-catalog-content\") pod \"3e3569b7-6f25-43de-9543-7456b580ed0b\" (UID: \"3e3569b7-6f25-43de-9543-7456b580ed0b\") "
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.928804 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-utilities" (OuterVolumeSpecName: "utilities") pod "3e3569b7-6f25-43de-9543-7456b580ed0b" (UID: "3e3569b7-6f25-43de-9543-7456b580ed0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.934648 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3569b7-6f25-43de-9543-7456b580ed0b-kube-api-access-tt5qn" (OuterVolumeSpecName: "kube-api-access-tt5qn") pod "3e3569b7-6f25-43de-9543-7456b580ed0b" (UID: "3e3569b7-6f25-43de-9543-7456b580ed0b"). InnerVolumeSpecName "kube-api-access-tt5qn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:49:58 crc kubenswrapper[4913]: I1001 13:49:58.971498 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e3569b7-6f25-43de-9543-7456b580ed0b" (UID: "3e3569b7-6f25-43de-9543-7456b580ed0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.030874 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt5qn\" (UniqueName: \"kubernetes.io/projected/3e3569b7-6f25-43de-9543-7456b580ed0b-kube-api-access-tt5qn\") on node \"crc\" DevicePath \"\""
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.030907 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.030917 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3569b7-6f25-43de-9543-7456b580ed0b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.094161 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c5js9" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="registry-server" probeResult="failure" output=<
Oct 01 13:49:59 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s
Oct 01 13:49:59 crc kubenswrapper[4913]: >
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.102300 4913 generic.go:334] "Generic (PLEG): container finished" podID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerID="50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807" exitCode=0
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.102339 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5679q" event={"ID":"3e3569b7-6f25-43de-9543-7456b580ed0b","Type":"ContainerDied","Data":"50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807"}
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.102365 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5679q" event={"ID":"3e3569b7-6f25-43de-9543-7456b580ed0b","Type":"ContainerDied","Data":"770ddcd559bcac472f937e55fc94d1cadcb4c865cdeb3c49ce3aa9b598f62abf"}
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.102381 4913 scope.go:117] "RemoveContainer" containerID="50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.102383 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5679q"
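The "Probe failed" record above shows the registry-server startup probe output verbatim: timeout connecting to ":50051" within 1s. That output shape is what a gRPC health-check client prints when the catalog pod's gRPC port never accepts a connection inside the probe's one-second budget. A minimal sketch of the same check, assuming plain TCP reachability stands in for the full gRPC health protocol (host, port and timeout mirror the log; nothing else is taken from kubelet source):

import socket

def registry_port_open(host: str = "127.0.0.1", port: int = 50051, timeout_s: float = 1.0) -> bool:
    # Mirror the probe's 1s budget against the registry-server gRPC port.
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if not registry_port_open():
        print('timeout: failed to connect service ":50051" within 1s')

Once the catalog container finishes extracting content and binds the port, the same check succeeds, matching the later "startup started" / "readiness ready" probe records.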
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.125019 4913 scope.go:117] "RemoveContainer" containerID="2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.146816 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5679q"]
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.154437 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5679q"]
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.157061 4913 scope.go:117] "RemoveContainer" containerID="8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.189939 4913 scope.go:117] "RemoveContainer" containerID="50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807"
Oct 01 13:49:59 crc kubenswrapper[4913]: E1001 13:49:59.190392 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807\": container with ID starting with 50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807 not found: ID does not exist" containerID="50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.190438 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807"} err="failed to get container status \"50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807\": rpc error: code = NotFound desc = could not find container \"50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807\": container with ID starting with 50239f8d25a3d528eefca3c6a0b202cbef9c1fe0a5b43df4bd6440ff768ea807 not found: ID does not exist"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.190465 4913 scope.go:117] "RemoveContainer" containerID="2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc"
Oct 01 13:49:59 crc kubenswrapper[4913]: E1001 13:49:59.190803 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc\": container with ID starting with 2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc not found: ID does not exist" containerID="2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.190839 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc"} err="failed to get container status \"2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc\": rpc error: code = NotFound desc = could not find container \"2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc\": container with ID starting with 2a56214b4f02bba3484cc7ade3ae669bd6f3cec9b8641c4c686172b4317839fc not found: ID does not exist"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.190865 4913 scope.go:117] "RemoveContainer" containerID="8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7"
Oct 01 13:49:59 crc kubenswrapper[4913]: E1001 13:49:59.191097 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7\": container with ID starting with 8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7 not found: ID does not exist" containerID="8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.191121 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7"} err="failed to get container status \"8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7\": rpc error: code = NotFound desc = could not find container \"8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7\": container with ID starting with 8c4eac42e2c4db574456807225081636534abd7b3bad903ae39e366a269256d7 not found: ID does not exist"
Oct 01 13:49:59 crc kubenswrapper[4913]: I1001 13:49:59.807947 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:49:59 crc kubenswrapper[4913]: E1001 13:49:59.808912 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:50:00 crc kubenswrapper[4913]: I1001 13:50:00.819952 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" path="/var/lib/kubelet/pods/3e3569b7-6f25-43de-9543-7456b580ed0b/volumes"
Oct 01 13:50:08 crc kubenswrapper[4913]: I1001 13:50:08.108709 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c5js9"
Oct 01 13:50:08 crc kubenswrapper[4913]: I1001 13:50:08.161423 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c5js9"
Oct 01 13:50:08 crc kubenswrapper[4913]: I1001 13:50:08.349175 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c5js9"]
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.197745 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c5js9" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="registry-server" containerID="cri-o://501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a" gracePeriod=2
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.836661 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5js9"
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.952695 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ghtq\" (UniqueName: \"kubernetes.io/projected/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-kube-api-access-9ghtq\") pod \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") "
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.952849 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-utilities\") pod \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") "
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.952968 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-catalog-content\") pod \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\" (UID: \"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12\") "
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.954803 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-utilities" (OuterVolumeSpecName: "utilities") pod "97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" (UID: "97b54ddc-5a2c-4559-bc55-4ea24b8aeb12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.955878 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:09 crc kubenswrapper[4913]: I1001 13:50:09.979795 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-kube-api-access-9ghtq" (OuterVolumeSpecName: "kube-api-access-9ghtq") pod "97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" (UID: "97b54ddc-5a2c-4559-bc55-4ea24b8aeb12"). InnerVolumeSpecName "kube-api-access-9ghtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.058473 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ghtq\" (UniqueName: \"kubernetes.io/projected/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-kube-api-access-9ghtq\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.066672 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" (UID: "97b54ddc-5a2c-4559-bc55-4ea24b8aeb12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.161034 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.206581 4913 generic.go:334] "Generic (PLEG): container finished" podID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerID="501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a" exitCode=0
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.206622 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerDied","Data":"501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a"}
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.206647 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5js9" event={"ID":"97b54ddc-5a2c-4559-bc55-4ea24b8aeb12","Type":"ContainerDied","Data":"d97a123eb01169560427260175e3d580353361c3bc3a0a43ea334bcfda6887af"}
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.206664 4913 scope.go:117] "RemoveContainer" containerID="501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.206788 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5js9"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.239594 4913 scope.go:117] "RemoveContainer" containerID="ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.241668 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c5js9"]
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.249436 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c5js9"]
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.264986 4913 scope.go:117] "RemoveContainer" containerID="762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.308690 4913 scope.go:117] "RemoveContainer" containerID="501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a"
Oct 01 13:50:10 crc kubenswrapper[4913]: E1001 13:50:10.309200 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a\": container with ID starting with 501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a not found: ID does not exist" containerID="501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.309257 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a"} err="failed to get container status \"501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a\": rpc error: code = NotFound desc = could not find container \"501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a\": container with ID starting with 501be451b67bf5bfb57a22ef96ee7a80b46a8c64b777264fd8f7437d5bbec65a not found: ID does not exist"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.309367 4913 scope.go:117] "RemoveContainer" containerID="ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c"
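Volume teardown in these records always walks the same three phases per volume: "operationExecutor.UnmountVolume started", "UnmountVolume.TearDown succeeded", and finally "Volume detached". A sketch (helper names are mine) that tracks which phase each volume reached, useful for spotting a teardown stalled between phases; the patterns assume the backslash-escaped quoting shown above:

import re
from collections import defaultdict

PHASES = {
    "started": re.compile(r'UnmountVolume started for volume \\"([^\\"]+)\\"'),
    "torn-down": re.compile(r'TearDown succeeded for volume .*\(OuterVolumeSpecName: "([^"]+)"\)'),
    "detached": re.compile(r'Volume detached for volume \\"([^\\"]+)\\"'),
}

def unmount_progress(journal_text: str) -> dict:
    # volume name -> set of phases observed for it
    progress = defaultdict(set)
    for line in journal_text.splitlines():
        for phase, pattern in PHASES.items():
            match = pattern.search(line)
            if match:
                progress[match.group(1)].add(phase)
    return dict(progress)

For the redhat-operators-c5js9 teardown above, each of "utilities", "kube-api-access-9ghtq" and "catalog-content" reaches all three phases, so the cleanup completed normally.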
Oct 01 13:50:10 crc kubenswrapper[4913]: E1001 13:50:10.309693 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c\": container with ID starting with ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c not found: ID does not exist" containerID="ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.309754 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c"} err="failed to get container status \"ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c\": rpc error: code = NotFound desc = could not find container \"ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c\": container with ID starting with ae1dd17e2469130c5b492dfbc47384f568c8c2f7ea86fd38715721ab8279629c not found: ID does not exist"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.309782 4913 scope.go:117] "RemoveContainer" containerID="762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de"
Oct 01 13:50:10 crc kubenswrapper[4913]: E1001 13:50:10.310102 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de\": container with ID starting with 762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de not found: ID does not exist" containerID="762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.310139 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de"} err="failed to get container status \"762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de\": rpc error: code = NotFound desc = could not find container \"762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de\": container with ID starting with 762f06672b31088d151649879fcafa133a098b7857528b50b0949c4ab44f64de not found: ID does not exist"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.812734 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:50:10 crc kubenswrapper[4913]: E1001 13:50:10.813061 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:50:10 crc kubenswrapper[4913]: I1001 13:50:10.820533 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" path="/var/lib/kubelet/pods/97b54ddc-5a2c-4559-bc55-4ea24b8aeb12/volumes"
Oct 01 13:50:25 crc kubenswrapper[4913]: I1001 13:50:25.807338 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:50:25 crc kubenswrapper[4913]: E1001 13:50:25.809118 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:50:37 crc kubenswrapper[4913]: I1001 13:50:37.807894 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:50:37 crc kubenswrapper[4913]: E1001 13:50:37.808705 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:50:48 crc kubenswrapper[4913]: I1001 13:50:48.835954 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:50:48 crc kubenswrapper[4913]: E1001 13:50:48.838601 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:51:02 crc kubenswrapper[4913]: I1001 13:51:02.819817 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:51:02 crc kubenswrapper[4913]: E1001 13:51:02.820889 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:51:16 crc kubenswrapper[4913]: I1001 13:51:16.806942 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:51:16 crc kubenswrapper[4913]: E1001 13:51:16.807730 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:51:29 crc kubenswrapper[4913]: I1001 13:51:29.807246 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:51:29 crc kubenswrapper[4913]: E1001 13:51:29.808034 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:51:42 crc kubenswrapper[4913]: I1001 13:51:42.807367 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:51:42 crc kubenswrapper[4913]: E1001 13:51:42.808171 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:51:55 crc kubenswrapper[4913]: I1001 13:51:55.806806 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:51:55 crc kubenswrapper[4913]: E1001 13:51:55.807585 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:52:08 crc kubenswrapper[4913]: I1001 13:52:08.813434 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:52:08 crc kubenswrapper[4913]: E1001 13:52:08.814166 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:52:20 crc kubenswrapper[4913]: I1001 13:52:20.810111 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:52:20 crc kubenswrapper[4913]: E1001 13:52:20.810915 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:52:33 crc kubenswrapper[4913]: I1001 13:52:33.806818 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:52:33 crc kubenswrapper[4913]: E1001 13:52:33.807709 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:52:48 crc kubenswrapper[4913]: I1001 13:52:48.814497 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:52:48 crc kubenswrapper[4913]: E1001 13:52:48.815311 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:52:59 crc kubenswrapper[4913]: I1001 13:52:59.806375 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:52:59 crc kubenswrapper[4913]: E1001 13:52:59.807059 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:53:10 crc kubenswrapper[4913]: I1001 13:53:10.806586 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:53:10 crc kubenswrapper[4913]: E1001 13:53:10.807250 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:53:25 crc kubenswrapper[4913]: I1001 13:53:25.807231 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:53:25 crc kubenswrapper[4913]: E1001 13:53:25.807885 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:53:38 crc kubenswrapper[4913]: I1001 13:53:38.811562 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:53:38 crc kubenswrapper[4913]: E1001 13:53:38.812356 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897"
Oct 01 13:53:51 crc kubenswrapper[4913]: I1001 13:53:51.809245 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01"
Oct 01 13:53:52 crc kubenswrapper[4913]: I1001 13:53:52.116485 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"d0fb00eef97c1c90bb64346fbd42f6941be6f65c7f9a42e83d43243c826553d4"}
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.138514 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wwtw5"]
Oct 01 13:54:05 crc kubenswrapper[4913]: E1001 13:54:05.139810 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="registry-server"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.139835 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="registry-server"
Oct 01 13:54:05 crc kubenswrapper[4913]: E1001 13:54:05.139860 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="extract-content"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.139874 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="extract-content"
Oct 01 13:54:05 crc kubenswrapper[4913]: E1001 13:54:05.139895 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="extract-content"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.139907 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="extract-content"
Oct 01 13:54:05 crc kubenswrapper[4913]: E1001 13:54:05.139929 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="registry-server"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.139939 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="registry-server"
Oct 01 13:54:05 crc kubenswrapper[4913]: E1001 13:54:05.139953 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="extract-utilities"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.139964 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="extract-utilities"
Oct 01 13:54:05 crc kubenswrapper[4913]: E1001 13:54:05.139987 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="extract-utilities"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.139997 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="extract-utilities"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.140398 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3569b7-6f25-43de-9543-7456b580ed0b" containerName="registry-server"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.140433 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b54ddc-5a2c-4559-bc55-4ea24b8aeb12" containerName="registry-server"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.143364 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwtw5"
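The cpu_manager/state_mem/memory_manager burst just before community-operators-wwtw5 is admitted is housekeeping: resource-manager state recorded for containers of pods that no longer exist (the two catalog pods removed earlier) is purged. A toy model of that bookkeeping, purely illustrative; the real kubelet keeps this in checkpointed state files, not a plain dict:

# Map (podUID, containerName) -> assigned CPU ids, loosely mimicking the
# CPU-manager state that the "Deleted CPUSet assignment" lines refer to.
def remove_stale_state(assignments: dict, live_pod_uids: set) -> dict:
    # Keep only entries whose pod still exists; everything else is stale.
    return {key: cpus for key, cpus in assignments.items() if key[0] in live_pod_uids}

state = {
    ("97b54ddc-5a2c-4559-bc55-4ea24b8aeb12", "registry-server"): {2, 3},
    ("3e3569b7-6f25-43de-9543-7456b580ed0b", "extract-content"): {1},
}
# Both pods were deleted above, so every entry is swept away:
assert remove_stale_state(state, live_pod_uids=set()) == {}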
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.146390 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wwtw5"]
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.201685 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cms\" (UniqueName: \"kubernetes.io/projected/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-kube-api-access-24cms\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.202008 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-utilities\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.202372 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-catalog-content\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.303981 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24cms\" (UniqueName: \"kubernetes.io/projected/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-kube-api-access-24cms\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.304066 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-utilities\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.304132 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-catalog-content\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.304875 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-catalog-content\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.305229 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-utilities\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.324424 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24cms\" (UniqueName: \"kubernetes.io/projected/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-kube-api-access-24cms\") pod \"community-operators-wwtw5\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") " pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.468544 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:05 crc kubenswrapper[4913]: I1001 13:54:05.961680 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wwtw5"]
Oct 01 13:54:06 crc kubenswrapper[4913]: I1001 13:54:06.242286 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerStarted","Data":"51c871ffe40e332aa95eb5916744ff59fd515ae4db3bb1b85bbbb0ebd98bd56c"}
Oct 01 13:54:07 crc kubenswrapper[4913]: I1001 13:54:07.251769 4913 generic.go:334] "Generic (PLEG): container finished" podID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerID="02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495" exitCode=0
Oct 01 13:54:07 crc kubenswrapper[4913]: I1001 13:54:07.251821 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerDied","Data":"02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495"}
Oct 01 13:54:08 crc kubenswrapper[4913]: I1001 13:54:08.264964 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerStarted","Data":"702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7"}
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.280848 4913 generic.go:334] "Generic (PLEG): container finished" podID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerID="702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7" exitCode=0
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.280929 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerDied","Data":"702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7"}
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.331685 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7sdvp"]
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.333990 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.344257 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sdvp"]
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.502163 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-catalog-content\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.502265 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbgg\" (UniqueName: \"kubernetes.io/projected/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-kube-api-access-hbbgg\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.502400 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-utilities\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.605268 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-catalog-content\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.605408 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbgg\" (UniqueName: \"kubernetes.io/projected/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-kube-api-access-hbbgg\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.605475 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-utilities\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.605673 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-catalog-content\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.606126 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-utilities\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.628986 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbgg\" (UniqueName: \"kubernetes.io/projected/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-kube-api-access-hbbgg\") pod \"redhat-marketplace-7sdvp\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") " pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:09 crc kubenswrapper[4913]: I1001 13:54:09.665411 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:10 crc kubenswrapper[4913]: I1001 13:54:10.216608 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sdvp"]
Oct 01 13:54:10 crc kubenswrapper[4913]: W1001 13:54:10.222265 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod174fa05a_d987_4b33_80ff_b0ac9fb5bb91.slice/crio-2b13b0e48bc5040811a31faf8cd1606cffff9aa116d67c7cfdb166096c7d8484 WatchSource:0}: Error finding container 2b13b0e48bc5040811a31faf8cd1606cffff9aa116d67c7cfdb166096c7d8484: Status 404 returned error can't find the container with id 2b13b0e48bc5040811a31faf8cd1606cffff9aa116d67c7cfdb166096c7d8484
Oct 01 13:54:10 crc kubenswrapper[4913]: I1001 13:54:10.295355 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sdvp" event={"ID":"174fa05a-d987-4b33-80ff-b0ac9fb5bb91","Type":"ContainerStarted","Data":"2b13b0e48bc5040811a31faf8cd1606cffff9aa116d67c7cfdb166096c7d8484"}
Oct 01 13:54:10 crc kubenswrapper[4913]: I1001 13:54:10.303903 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerStarted","Data":"43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6"}
Oct 01 13:54:10 crc kubenswrapper[4913]: I1001 13:54:10.322468 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wwtw5" podStartSLOduration=2.8655812149999997 podStartE2EDuration="5.322447213s" podCreationTimestamp="2025-10-01 13:54:05 +0000 UTC" firstStartedPulling="2025-10-01 13:54:07.253962779 +0000 UTC m=+4579.157438357" lastFinishedPulling="2025-10-01 13:54:09.710828777 +0000 UTC m=+4581.614304355" observedRunningTime="2025-10-01 13:54:10.321306041 +0000 UTC m=+4582.224781639" watchObservedRunningTime="2025-10-01 13:54:10.322447213 +0000 UTC m=+4582.225922781"
Oct 01 13:54:11 crc kubenswrapper[4913]: I1001 13:54:11.316730 4913 generic.go:334] "Generic (PLEG): container finished" podID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerID="f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6" exitCode=0
Oct 01 13:54:11 crc kubenswrapper[4913]: I1001 13:54:11.316786 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sdvp" event={"ID":"174fa05a-d987-4b33-80ff-b0ac9fb5bb91","Type":"ContainerDied","Data":"f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6"}
Oct 01 13:54:12 crc kubenswrapper[4913]: I1001 13:54:12.333997 4913 generic.go:334] "Generic (PLEG): container finished" podID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerID="601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21" exitCode=0
Oct 01 13:54:12 crc kubenswrapper[4913]: I1001 13:54:12.334177 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sdvp" event={"ID":"174fa05a-d987-4b33-80ff-b0ac9fb5bb91","Type":"ContainerDied","Data":"601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21"}
Oct 01 13:54:14 crc kubenswrapper[4913]: I1001 13:54:14.351418 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sdvp" event={"ID":"174fa05a-d987-4b33-80ff-b0ac9fb5bb91","Type":"ContainerStarted","Data":"4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81"}
Oct 01 13:54:14 crc kubenswrapper[4913]: I1001 13:54:14.378939 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7sdvp" podStartSLOduration=3.806636979 podStartE2EDuration="5.378917676s" podCreationTimestamp="2025-10-01 13:54:09 +0000 UTC" firstStartedPulling="2025-10-01 13:54:11.318693654 +0000 UTC m=+4583.222169232" lastFinishedPulling="2025-10-01 13:54:12.890974351 +0000 UTC m=+4584.794449929" observedRunningTime="2025-10-01 13:54:14.370711256 +0000 UTC m=+4586.274186864" watchObservedRunningTime="2025-10-01 13:54:14.378917676 +0000 UTC m=+4586.282393264"
Oct 01 13:54:15 crc kubenswrapper[4913]: I1001 13:54:15.468849 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:15 crc kubenswrapper[4913]: I1001 13:54:15.469190 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:15 crc kubenswrapper[4913]: I1001 13:54:15.512548 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:16 crc kubenswrapper[4913]: I1001 13:54:16.414849 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:17 crc kubenswrapper[4913]: I1001 13:54:17.327261 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wwtw5"]
Oct 01 13:54:18 crc kubenswrapper[4913]: I1001 13:54:18.382968 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wwtw5" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="registry-server" containerID="cri-o://43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6" gracePeriod=2
Oct 01 13:54:18 crc kubenswrapper[4913]: I1001 13:54:18.969818 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.099453 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-catalog-content\") pod \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") "
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.112746 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24cms\" (UniqueName: \"kubernetes.io/projected/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-kube-api-access-24cms\") pod \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") "
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.112892 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-utilities\") pod \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\" (UID: \"3f77a630-1b8f-43d9-83f2-e4dd42d714f7\") "
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.114799 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-utilities" (OuterVolumeSpecName: "utilities") pod "3f77a630-1b8f-43d9-83f2-e4dd42d714f7" (UID: "3f77a630-1b8f-43d9-83f2-e4dd42d714f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.155069 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f77a630-1b8f-43d9-83f2-e4dd42d714f7" (UID: "3f77a630-1b8f-43d9-83f2-e4dd42d714f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.188147 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-kube-api-access-24cms" (OuterVolumeSpecName: "kube-api-access-24cms") pod "3f77a630-1b8f-43d9-83f2-e4dd42d714f7" (UID: "3f77a630-1b8f-43d9-83f2-e4dd42d714f7"). InnerVolumeSpecName "kube-api-access-24cms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.222265 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.222320 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24cms\" (UniqueName: \"kubernetes.io/projected/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-kube-api-access-24cms\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.222332 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f77a630-1b8f-43d9-83f2-e4dd42d714f7-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.395459 4913 generic.go:334] "Generic (PLEG): container finished" podID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerID="43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6" exitCode=0
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.395533 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerDied","Data":"43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6"}
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.395584 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwtw5" event={"ID":"3f77a630-1b8f-43d9-83f2-e4dd42d714f7","Type":"ContainerDied","Data":"51c871ffe40e332aa95eb5916744ff59fd515ae4db3bb1b85bbbb0ebd98bd56c"}
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.395607 4913 scope.go:117] "RemoveContainer" containerID="43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.395621 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwtw5"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.419101 4913 scope.go:117] "RemoveContainer" containerID="702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.449862 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wwtw5"]
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.459574 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wwtw5"]
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.462378 4913 scope.go:117] "RemoveContainer" containerID="02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.504001 4913 scope.go:117] "RemoveContainer" containerID="43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6"
Oct 01 13:54:19 crc kubenswrapper[4913]: E1001 13:54:19.504467 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6\": container with ID starting with 43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6 not found: ID does not exist" containerID="43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.504513 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6"} err="failed to get container status \"43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6\": rpc error: code = NotFound desc = could not find container \"43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6\": container with ID starting with 43a1f3cb662f5f0f6317929dd724d5e14d0400df0601a7dea476012871225bc6 not found: ID does not exist"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.504567 4913 scope.go:117] "RemoveContainer" containerID="702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7"
Oct 01 13:54:19 crc kubenswrapper[4913]: E1001 13:54:19.505068 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7\": container with ID starting with 702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7 not found: ID does not exist" containerID="702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.505108 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7"} err="failed to get container status \"702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7\": rpc error: code = NotFound desc = could not find container \"702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7\": container with ID starting with 702b89977e5e1013c4aaa5f566c94acf47c500f3925885547e5747f2ef6972f7 not found: ID does not exist"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.505137 4913 scope.go:117] "RemoveContainer" containerID="02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495"
Oct 01 13:54:19 crc kubenswrapper[4913]: E1001 13:54:19.505651 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495\": container with ID starting with 02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495 not found: ID does not exist" containerID="02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.505678 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495"} err="failed to get container status \"02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495\": rpc error: code = NotFound desc = could not find container \"02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495\": container with ID starting with 02be444a3621d5467dd02433a55b597090a926b3f5866f17037ab572b721c495 not found: ID does not exist"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.667199 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.667241 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:19 crc kubenswrapper[4913]: I1001 13:54:19.737510 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:20 crc kubenswrapper[4913]: I1001 13:54:20.452932 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:20 crc kubenswrapper[4913]: I1001 13:54:20.822019 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" path="/var/lib/kubelet/pods/3f77a630-1b8f-43d9-83f2-e4dd42d714f7/volumes"
Oct 01 13:54:22 crc kubenswrapper[4913]: I1001 13:54:22.123993 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sdvp"]
Oct 01 13:54:22 crc kubenswrapper[4913]: I1001 13:54:22.425793 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7sdvp" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="registry-server" containerID="cri-o://4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81" gracePeriod=2
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.166671 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sdvp"
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.322357 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-catalog-content\") pod \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") "
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.322418 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-utilities\") pod \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") "
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.322743 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbbgg\" (UniqueName: \"kubernetes.io/projected/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-kube-api-access-hbbgg\") pod \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\" (UID: \"174fa05a-d987-4b33-80ff-b0ac9fb5bb91\") "
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.323439 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-utilities" (OuterVolumeSpecName: "utilities") pod "174fa05a-d987-4b33-80ff-b0ac9fb5bb91" (UID: "174fa05a-d987-4b33-80ff-b0ac9fb5bb91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.324221 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.329002 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-kube-api-access-hbbgg" (OuterVolumeSpecName: "kube-api-access-hbbgg") pod "174fa05a-d987-4b33-80ff-b0ac9fb5bb91" (UID: "174fa05a-d987-4b33-80ff-b0ac9fb5bb91"). InnerVolumeSpecName "kube-api-access-hbbgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.336360 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "174fa05a-d987-4b33-80ff-b0ac9fb5bb91" (UID: "174fa05a-d987-4b33-80ff-b0ac9fb5bb91"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.426133 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbbgg\" (UniqueName: \"kubernetes.io/projected/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-kube-api-access-hbbgg\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.426171 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fa05a-d987-4b33-80ff-b0ac9fb5bb91-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.434775 4913 generic.go:334] "Generic (PLEG): container finished" podID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerID="4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81" exitCode=0 Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.434822 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sdvp" event={"ID":"174fa05a-d987-4b33-80ff-b0ac9fb5bb91","Type":"ContainerDied","Data":"4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81"} Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.434851 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sdvp" event={"ID":"174fa05a-d987-4b33-80ff-b0ac9fb5bb91","Type":"ContainerDied","Data":"2b13b0e48bc5040811a31faf8cd1606cffff9aa116d67c7cfdb166096c7d8484"} Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.434875 4913 scope.go:117] "RemoveContainer" containerID="4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.435028 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sdvp" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.459688 4913 scope.go:117] "RemoveContainer" containerID="601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.473444 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sdvp"] Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.481320 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sdvp"] Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.489876 4913 scope.go:117] "RemoveContainer" containerID="f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.534783 4913 scope.go:117] "RemoveContainer" containerID="4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81" Oct 01 13:54:23 crc kubenswrapper[4913]: E1001 13:54:23.535513 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81\": container with ID starting with 4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81 not found: ID does not exist" containerID="4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.535571 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81"} err="failed to get container status \"4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81\": rpc error: code = NotFound desc = could not find container \"4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81\": container with ID starting with 4702bb4acc69e06aa76c74974b5c5f7da1a7c3f74d0d8f6d4dab09699ee55b81 not found: ID does not exist" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.535606 4913 scope.go:117] "RemoveContainer" containerID="601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21" Oct 01 13:54:23 crc kubenswrapper[4913]: E1001 13:54:23.536098 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21\": container with ID starting with 601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21 not found: ID does not exist" containerID="601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.536132 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21"} err="failed to get container status \"601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21\": rpc error: code = NotFound desc = could not find container \"601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21\": container with ID starting with 601136272d0bc95fd0697a4906bafa5e2732d7eee946c95e8ed815cc8c9def21 not found: ID does not exist" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.536155 4913 scope.go:117] "RemoveContainer" containerID="f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6" Oct 01 13:54:23 crc kubenswrapper[4913]: E1001 13:54:23.536414 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6\": container with ID starting with f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6 not found: ID does not exist" containerID="f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6" Oct 01 13:54:23 crc kubenswrapper[4913]: I1001 13:54:23.536476 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6"} err="failed to get container status \"f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6\": rpc error: code = NotFound desc = could not find container \"f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6\": container with ID starting with f2301b3c8bba59ff0121208cfc662852ced0f565815aecdc77ffbe3e427826d6 not found: ID does not exist" Oct 01 13:54:24 crc kubenswrapper[4913]: I1001 13:54:24.817978 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" path="/var/lib/kubelet/pods/174fa05a-d987-4b33-80ff-b0ac9fb5bb91/volumes" Oct 01 13:56:10 crc kubenswrapper[4913]: I1001 13:56:10.083915 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:56:10 crc kubenswrapper[4913]: I1001 13:56:10.085013 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:56:40 crc kubenswrapper[4913]: I1001 13:56:40.083975 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:56:40 crc kubenswrapper[4913]: I1001 13:56:40.084481 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.083448 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.084059 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.084107 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.084909 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0fb00eef97c1c90bb64346fbd42f6941be6f65c7f9a42e83d43243c826553d4"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.084969 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://d0fb00eef97c1c90bb64346fbd42f6941be6f65c7f9a42e83d43243c826553d4" gracePeriod=600 Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.883408 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="d0fb00eef97c1c90bb64346fbd42f6941be6f65c7f9a42e83d43243c826553d4" exitCode=0 Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.883509 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"d0fb00eef97c1c90bb64346fbd42f6941be6f65c7f9a42e83d43243c826553d4"} Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.884222 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379"} Oct 01 13:57:10 crc kubenswrapper[4913]: I1001 13:57:10.884248 4913 scope.go:117] "RemoveContainer" containerID="0e45325547d25cf60e7b5ba50d0517717fba0328b0a0391e30258e7b6ddc6e01" Oct 01 13:59:10 crc kubenswrapper[4913]: I1001 13:59:10.083531 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:59:10 crc kubenswrapper[4913]: I1001 13:59:10.084065 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:59:40 crc kubenswrapper[4913]: I1001 13:59:40.084382 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:59:40 crc kubenswrapper[4913]: I1001 13:59:40.084902 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:00:00 crc 
kubenswrapper[4913]: I1001 14:00:00.144873 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh"] Oct 01 14:00:00 crc kubenswrapper[4913]: E1001 14:00:00.146560 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="extract-content" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.146579 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="extract-content" Oct 01 14:00:00 crc kubenswrapper[4913]: E1001 14:00:00.146637 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="extract-utilities" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.146647 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="extract-utilities" Oct 01 14:00:00 crc kubenswrapper[4913]: E1001 14:00:00.146673 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="registry-server" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.146684 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="registry-server" Oct 01 14:00:00 crc kubenswrapper[4913]: E1001 14:00:00.146722 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="registry-server" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.146731 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="registry-server" Oct 01 14:00:00 crc kubenswrapper[4913]: E1001 14:00:00.146756 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="extract-content" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.146766 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="extract-content" Oct 01 14:00:00 crc kubenswrapper[4913]: E1001 14:00:00.146800 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="extract-utilities" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.146809 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="extract-utilities" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.147366 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f77a630-1b8f-43d9-83f2-e4dd42d714f7" containerName="registry-server" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.147407 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="174fa05a-d987-4b33-80ff-b0ac9fb5bb91" containerName="registry-server" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.148511 4913 util.go:30] "No sandbox for pod can be found. 
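Note: the numeric suffix in collect-profiles-29322120 is the job's scheduled time in minutes since the Unix epoch (the CronJob controller's naming convention), which decodes to exactly the 14:00:00 UTC tick at which the SyncLoop ADD above fires; keystone-cron-29322121 below lands one minute later, at 14:01:00. A quick check in Go:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // 29322120 minutes since the epoch, as embedded in the pod name.
        fmt.Println(time.Unix(29322120*60, 0).UTC())
        // prints: 2025-10-01 14:00:00 +0000 UTC
    }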
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.154014 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.151838 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.177920 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh"] Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.238602 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4236a171-5b8c-4150-8f3f-dee472fd9e0a-config-volume\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.238859 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkhv\" (UniqueName: \"kubernetes.io/projected/4236a171-5b8c-4150-8f3f-dee472fd9e0a-kube-api-access-5lkhv\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.238899 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4236a171-5b8c-4150-8f3f-dee472fd9e0a-secret-volume\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.340512 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4236a171-5b8c-4150-8f3f-dee472fd9e0a-config-volume\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.340725 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkhv\" (UniqueName: \"kubernetes.io/projected/4236a171-5b8c-4150-8f3f-dee472fd9e0a-kube-api-access-5lkhv\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.340768 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4236a171-5b8c-4150-8f3f-dee472fd9e0a-secret-volume\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.341668 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4236a171-5b8c-4150-8f3f-dee472fd9e0a-config-volume\") pod 
\"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.352786 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4236a171-5b8c-4150-8f3f-dee472fd9e0a-secret-volume\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.355476 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkhv\" (UniqueName: \"kubernetes.io/projected/4236a171-5b8c-4150-8f3f-dee472fd9e0a-kube-api-access-5lkhv\") pod \"collect-profiles-29322120-mzjjh\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:00 crc kubenswrapper[4913]: I1001 14:00:00.492863 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:01 crc kubenswrapper[4913]: I1001 14:00:01.002563 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh"] Oct 01 14:00:02 crc kubenswrapper[4913]: I1001 14:00:02.390607 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" event={"ID":"4236a171-5b8c-4150-8f3f-dee472fd9e0a","Type":"ContainerStarted","Data":"9ab28a08e822377d28dadf6169e206d71c54eb3c6eba1cb69d5ab9a8c7af52b9"} Oct 01 14:00:02 crc kubenswrapper[4913]: I1001 14:00:02.390980 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" event={"ID":"4236a171-5b8c-4150-8f3f-dee472fd9e0a","Type":"ContainerStarted","Data":"b97866ab918f29f4f9f618e08d9ee76752ad92afe996955e203b66daa146748d"} Oct 01 14:00:02 crc kubenswrapper[4913]: I1001 14:00:02.408218 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" podStartSLOduration=2.408155583 podStartE2EDuration="2.408155583s" podCreationTimestamp="2025-10-01 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:00:02.405764646 +0000 UTC m=+4934.309240224" watchObservedRunningTime="2025-10-01 14:00:02.408155583 +0000 UTC m=+4934.311631161" Oct 01 14:00:03 crc kubenswrapper[4913]: I1001 14:00:03.401181 4913 generic.go:334] "Generic (PLEG): container finished" podID="4236a171-5b8c-4150-8f3f-dee472fd9e0a" containerID="9ab28a08e822377d28dadf6169e206d71c54eb3c6eba1cb69d5ab9a8c7af52b9" exitCode=0 Oct 01 14:00:03 crc kubenswrapper[4913]: I1001 14:00:03.401346 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" event={"ID":"4236a171-5b8c-4150-8f3f-dee472fd9e0a","Type":"ContainerDied","Data":"9ab28a08e822377d28dadf6169e206d71c54eb3c6eba1cb69d5ab9a8c7af52b9"} Oct 01 14:00:04 crc kubenswrapper[4913]: I1001 14:00:04.865833 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.044591 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4236a171-5b8c-4150-8f3f-dee472fd9e0a-secret-volume\") pod \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.044897 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lkhv\" (UniqueName: \"kubernetes.io/projected/4236a171-5b8c-4150-8f3f-dee472fd9e0a-kube-api-access-5lkhv\") pod \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.045094 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4236a171-5b8c-4150-8f3f-dee472fd9e0a-config-volume\") pod \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\" (UID: \"4236a171-5b8c-4150-8f3f-dee472fd9e0a\") " Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.045973 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4236a171-5b8c-4150-8f3f-dee472fd9e0a-config-volume" (OuterVolumeSpecName: "config-volume") pod "4236a171-5b8c-4150-8f3f-dee472fd9e0a" (UID: "4236a171-5b8c-4150-8f3f-dee472fd9e0a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.051287 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4236a171-5b8c-4150-8f3f-dee472fd9e0a-kube-api-access-5lkhv" (OuterVolumeSpecName: "kube-api-access-5lkhv") pod "4236a171-5b8c-4150-8f3f-dee472fd9e0a" (UID: "4236a171-5b8c-4150-8f3f-dee472fd9e0a"). InnerVolumeSpecName "kube-api-access-5lkhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.051945 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4236a171-5b8c-4150-8f3f-dee472fd9e0a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4236a171-5b8c-4150-8f3f-dee472fd9e0a" (UID: "4236a171-5b8c-4150-8f3f-dee472fd9e0a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.147469 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4236a171-5b8c-4150-8f3f-dee472fd9e0a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.147519 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4236a171-5b8c-4150-8f3f-dee472fd9e0a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.147532 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lkhv\" (UniqueName: \"kubernetes.io/projected/4236a171-5b8c-4150-8f3f-dee472fd9e0a-kube-api-access-5lkhv\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.417680 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" event={"ID":"4236a171-5b8c-4150-8f3f-dee472fd9e0a","Type":"ContainerDied","Data":"b97866ab918f29f4f9f618e08d9ee76752ad92afe996955e203b66daa146748d"} Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.417717 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b97866ab918f29f4f9f618e08d9ee76752ad92afe996955e203b66daa146748d" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.417758 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-mzjjh" Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.481660 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj"] Oct 01 14:00:05 crc kubenswrapper[4913]: I1001 14:00:05.491142 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-c6nwj"] Oct 01 14:00:06 crc kubenswrapper[4913]: I1001 14:00:06.817784 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02fddc88-0d2f-4339-acc7-b2606d785b76" path="/var/lib/kubelet/pods/02fddc88-0d2f-4339-acc7-b2606d785b76/volumes" Oct 01 14:00:08 crc kubenswrapper[4913]: I1001 14:00:08.599498 4913 scope.go:117] "RemoveContainer" containerID="a6e30bc4da301abfb999bc669a79e68e7d6016a20833712e66828bd2b8b85923" Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.083867 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.084484 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.084544 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.085470 4913 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.085538 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" gracePeriod=600 Oct 01 14:00:10 crc kubenswrapper[4913]: E1001 14:00:10.227528 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.462091 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" exitCode=0 Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.462144 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379"} Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.462181 4913 scope.go:117] "RemoveContainer" containerID="d0fb00eef97c1c90bb64346fbd42f6941be6f65c7f9a42e83d43243c826553d4" Oct 01 14:00:10 crc kubenswrapper[4913]: I1001 14:00:10.463347 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:00:10 crc kubenswrapper[4913]: E1001 14:00:10.463974 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:00:23 crc kubenswrapper[4913]: I1001 14:00:23.807236 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:00:23 crc kubenswrapper[4913]: E1001 14:00:23.808073 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.569397 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bdldd"] Oct 01 14:00:24 crc kubenswrapper[4913]: E1001 14:00:24.569988 4913 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4236a171-5b8c-4150-8f3f-dee472fd9e0a" containerName="collect-profiles" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.570016 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4236a171-5b8c-4150-8f3f-dee472fd9e0a" containerName="collect-profiles" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.570240 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4236a171-5b8c-4150-8f3f-dee472fd9e0a" containerName="collect-profiles" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.571909 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.581063 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdldd"] Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.651589 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwmj\" (UniqueName: \"kubernetes.io/projected/80210be9-64e3-466f-9cf7-4333365021bf-kube-api-access-5jwmj\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.651669 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-catalog-content\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.652068 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-utilities\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.753756 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwmj\" (UniqueName: \"kubernetes.io/projected/80210be9-64e3-466f-9cf7-4333365021bf-kube-api-access-5jwmj\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.753819 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-catalog-content\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.753936 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-utilities\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.754429 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-catalog-content\") pod 
\"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.754449 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-utilities\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.773396 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwmj\" (UniqueName: \"kubernetes.io/projected/80210be9-64e3-466f-9cf7-4333365021bf-kube-api-access-5jwmj\") pod \"redhat-operators-bdldd\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:24 crc kubenswrapper[4913]: I1001 14:00:24.898477 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:25 crc kubenswrapper[4913]: I1001 14:00:25.493172 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdldd"] Oct 01 14:00:25 crc kubenswrapper[4913]: I1001 14:00:25.585976 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerStarted","Data":"40ef104222820ab767ccb84402902719b1103012b7e5366971e1bb8de33eef84"} Oct 01 14:00:26 crc kubenswrapper[4913]: I1001 14:00:26.595738 4913 generic.go:334] "Generic (PLEG): container finished" podID="80210be9-64e3-466f-9cf7-4333365021bf" containerID="7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c" exitCode=0 Oct 01 14:00:26 crc kubenswrapper[4913]: I1001 14:00:26.595791 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerDied","Data":"7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c"} Oct 01 14:00:26 crc kubenswrapper[4913]: I1001 14:00:26.597865 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:00:27 crc kubenswrapper[4913]: I1001 14:00:27.608726 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerStarted","Data":"38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31"} Oct 01 14:00:31 crc kubenswrapper[4913]: I1001 14:00:31.639879 4913 generic.go:334] "Generic (PLEG): container finished" podID="80210be9-64e3-466f-9cf7-4333365021bf" containerID="38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31" exitCode=0 Oct 01 14:00:31 crc kubenswrapper[4913]: I1001 14:00:31.639956 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerDied","Data":"38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31"} Oct 01 14:00:32 crc kubenswrapper[4913]: I1001 14:00:32.650762 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerStarted","Data":"fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b"} Oct 01 14:00:32 crc 
kubenswrapper[4913]: I1001 14:00:32.671423 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bdldd" podStartSLOduration=3.154956621 podStartE2EDuration="8.67140436s" podCreationTimestamp="2025-10-01 14:00:24 +0000 UTC" firstStartedPulling="2025-10-01 14:00:26.597625808 +0000 UTC m=+4958.501101386" lastFinishedPulling="2025-10-01 14:00:32.114073537 +0000 UTC m=+4964.017549125" observedRunningTime="2025-10-01 14:00:32.667448998 +0000 UTC m=+4964.570924586" watchObservedRunningTime="2025-10-01 14:00:32.67140436 +0000 UTC m=+4964.574879948" Oct 01 14:00:34 crc kubenswrapper[4913]: I1001 14:00:34.899555 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:34 crc kubenswrapper[4913]: I1001 14:00:34.899867 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:35 crc kubenswrapper[4913]: I1001 14:00:35.949389 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bdldd" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="registry-server" probeResult="failure" output=< Oct 01 14:00:35 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s Oct 01 14:00:35 crc kubenswrapper[4913]: > Oct 01 14:00:38 crc kubenswrapper[4913]: I1001 14:00:38.813087 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:00:38 crc kubenswrapper[4913]: E1001 14:00:38.813841 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:00:44 crc kubenswrapper[4913]: I1001 14:00:44.948151 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:45 crc kubenswrapper[4913]: I1001 14:00:45.004184 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:45 crc kubenswrapper[4913]: I1001 14:00:45.186568 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdldd"] Oct 01 14:00:46 crc kubenswrapper[4913]: I1001 14:00:46.774363 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bdldd" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="registry-server" containerID="cri-o://fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b" gracePeriod=2 Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.355732 4913 util.go:48] "No ready sandbox for pod can be found. 
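Note: the pod_startup_latency_tracker line above makes the accounting explicit: podStartE2EDuration (8.671s, from podCreationTimestamp 14:00:24 to observedRunningTime 14:00:32.667) minus the image-pull window (lastFinishedPulling - firstStartedPulling = 14:00:32.114 - 14:00:26.597 ≈ 5.516s) gives podStartSLOduration ≈ 3.155s, matching the logged 3.154956621, since the SLO metric excludes time spent pulling images. Compare the collect-profiles pod earlier, whose pull timestamps are the zero time (0001-01-01): nothing was pulled, so its SLO and E2E durations coincide at 2.408s.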
Need to start a new one" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.483792 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jwmj\" (UniqueName: \"kubernetes.io/projected/80210be9-64e3-466f-9cf7-4333365021bf-kube-api-access-5jwmj\") pod \"80210be9-64e3-466f-9cf7-4333365021bf\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.483928 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-catalog-content\") pod \"80210be9-64e3-466f-9cf7-4333365021bf\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.484005 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-utilities\") pod \"80210be9-64e3-466f-9cf7-4333365021bf\" (UID: \"80210be9-64e3-466f-9cf7-4333365021bf\") " Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.484646 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-utilities" (OuterVolumeSpecName: "utilities") pod "80210be9-64e3-466f-9cf7-4333365021bf" (UID: "80210be9-64e3-466f-9cf7-4333365021bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.485025 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.489076 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80210be9-64e3-466f-9cf7-4333365021bf-kube-api-access-5jwmj" (OuterVolumeSpecName: "kube-api-access-5jwmj") pod "80210be9-64e3-466f-9cf7-4333365021bf" (UID: "80210be9-64e3-466f-9cf7-4333365021bf"). InnerVolumeSpecName "kube-api-access-5jwmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.561639 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80210be9-64e3-466f-9cf7-4333365021bf" (UID: "80210be9-64e3-466f-9cf7-4333365021bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.586967 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jwmj\" (UniqueName: \"kubernetes.io/projected/80210be9-64e3-466f-9cf7-4333365021bf-kube-api-access-5jwmj\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.587201 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80210be9-64e3-466f-9cf7-4333365021bf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.784447 4913 generic.go:334] "Generic (PLEG): container finished" podID="80210be9-64e3-466f-9cf7-4333365021bf" containerID="fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b" exitCode=0 Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.784530 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdldd" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.784552 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerDied","Data":"fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b"} Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.784874 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdldd" event={"ID":"80210be9-64e3-466f-9cf7-4333365021bf","Type":"ContainerDied","Data":"40ef104222820ab767ccb84402902719b1103012b7e5366971e1bb8de33eef84"} Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.784896 4913 scope.go:117] "RemoveContainer" containerID="fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.820011 4913 scope.go:117] "RemoveContainer" containerID="38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.820132 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdldd"] Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.827591 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bdldd"] Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.848502 4913 scope.go:117] "RemoveContainer" containerID="7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.897109 4913 scope.go:117] "RemoveContainer" containerID="fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b" Oct 01 14:00:47 crc kubenswrapper[4913]: E1001 14:00:47.897626 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b\": container with ID starting with fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b not found: ID does not exist" containerID="fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.897662 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b"} err="failed to get container status \"fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b\": 
rpc error: code = NotFound desc = could not find container \"fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b\": container with ID starting with fd7c0d1102e8fad20e1aa5e9dd6f407bff29d5f0d8e96fff2041ef02f917d89b not found: ID does not exist" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.897686 4913 scope.go:117] "RemoveContainer" containerID="38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31" Oct 01 14:00:47 crc kubenswrapper[4913]: E1001 14:00:47.898073 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31\": container with ID starting with 38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31 not found: ID does not exist" containerID="38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.898092 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31"} err="failed to get container status \"38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31\": rpc error: code = NotFound desc = could not find container \"38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31\": container with ID starting with 38559a0f7d15aedb9d26895ee6469afa88e759f9e7a4b2df0bf09a962561fd31 not found: ID does not exist" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.898106 4913 scope.go:117] "RemoveContainer" containerID="7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c" Oct 01 14:00:47 crc kubenswrapper[4913]: E1001 14:00:47.898380 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c\": container with ID starting with 7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c not found: ID does not exist" containerID="7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c" Oct 01 14:00:47 crc kubenswrapper[4913]: I1001 14:00:47.898400 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c"} err="failed to get container status \"7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c\": rpc error: code = NotFound desc = could not find container \"7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c\": container with ID starting with 7bc58fea122bb542a029b22ab7fca9ed5b0bc3f5ebfcd7fd763a62ca4e88625c not found: ID does not exist" Oct 01 14:00:48 crc kubenswrapper[4913]: I1001 14:00:48.820959 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80210be9-64e3-466f-9cf7-4333365021bf" path="/var/lib/kubelet/pods/80210be9-64e3-466f-9cf7-4333365021bf/volumes" Oct 01 14:00:50 crc kubenswrapper[4913]: I1001 14:00:50.807526 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:00:50 crc kubenswrapper[4913]: E1001 14:00:50.808428 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.137869 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322121-gg2t8"] Oct 01 14:01:00 crc kubenswrapper[4913]: E1001 14:01:00.138754 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="extract-utilities" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.138767 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="extract-utilities" Oct 01 14:01:00 crc kubenswrapper[4913]: E1001 14:01:00.138788 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="extract-content" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.138794 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="extract-content" Oct 01 14:01:00 crc kubenswrapper[4913]: E1001 14:01:00.138808 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="registry-server" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.138815 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="registry-server" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.138984 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="80210be9-64e3-466f-9cf7-4333365021bf" containerName="registry-server" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.139638 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.154646 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322121-gg2t8"] Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.341019 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-fernet-keys\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.341109 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hx98\" (UniqueName: \"kubernetes.io/projected/f06702e3-f9e7-4e79-99d7-24b3203a1051-kube-api-access-2hx98\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.341136 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-config-data\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.342005 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-combined-ca-bundle\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.443098 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-combined-ca-bundle\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.443180 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-fernet-keys\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.443236 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hx98\" (UniqueName: \"kubernetes.io/projected/f06702e3-f9e7-4e79-99d7-24b3203a1051-kube-api-access-2hx98\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.443255 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-config-data\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.497538 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-fernet-keys\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.498198 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-combined-ca-bundle\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.501970 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hx98\" (UniqueName: \"kubernetes.io/projected/f06702e3-f9e7-4e79-99d7-24b3203a1051-kube-api-access-2hx98\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.508811 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-config-data\") pod \"keystone-cron-29322121-gg2t8\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:00 crc kubenswrapper[4913]: I1001 14:01:00.758848 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:01 crc kubenswrapper[4913]: I1001 14:01:01.225640 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322121-gg2t8"] Oct 01 14:01:01 crc kubenswrapper[4913]: I1001 14:01:01.912344 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-gg2t8" event={"ID":"f06702e3-f9e7-4e79-99d7-24b3203a1051","Type":"ContainerStarted","Data":"655ed20a5d94dcdc8767cd90e1ce2c4042d9200e01329cca1cc68be6ae16a917"} Oct 01 14:01:01 crc kubenswrapper[4913]: I1001 14:01:01.912697 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-gg2t8" event={"ID":"f06702e3-f9e7-4e79-99d7-24b3203a1051","Type":"ContainerStarted","Data":"43fbfd0d2fa8432a128b1b6c6cbd148720178c0c3dd492c9162fc92b42ededa8"} Oct 01 14:01:01 crc kubenswrapper[4913]: I1001 14:01:01.928070 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322121-gg2t8" podStartSLOduration=1.9280530649999998 podStartE2EDuration="1.928053065s" podCreationTimestamp="2025-10-01 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:01:01.924452833 +0000 UTC m=+4993.827928421" watchObservedRunningTime="2025-10-01 14:01:01.928053065 +0000 UTC m=+4993.831528643" Oct 01 14:01:03 crc kubenswrapper[4913]: I1001 14:01:03.943111 4913 generic.go:334] "Generic (PLEG): container finished" podID="f06702e3-f9e7-4e79-99d7-24b3203a1051" containerID="655ed20a5d94dcdc8767cd90e1ce2c4042d9200e01329cca1cc68be6ae16a917" exitCode=0 Oct 01 14:01:03 crc kubenswrapper[4913]: I1001 14:01:03.943401 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-gg2t8" event={"ID":"f06702e3-f9e7-4e79-99d7-24b3203a1051","Type":"ContainerDied","Data":"655ed20a5d94dcdc8767cd90e1ce2c4042d9200e01329cca1cc68be6ae16a917"} Oct 01 14:01:05 crc 
kubenswrapper[4913]: I1001 14:01:05.368324 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.555730 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-config-data\") pod \"f06702e3-f9e7-4e79-99d7-24b3203a1051\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.555866 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hx98\" (UniqueName: \"kubernetes.io/projected/f06702e3-f9e7-4e79-99d7-24b3203a1051-kube-api-access-2hx98\") pod \"f06702e3-f9e7-4e79-99d7-24b3203a1051\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.555953 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-combined-ca-bundle\") pod \"f06702e3-f9e7-4e79-99d7-24b3203a1051\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.555970 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-fernet-keys\") pod \"f06702e3-f9e7-4e79-99d7-24b3203a1051\" (UID: \"f06702e3-f9e7-4e79-99d7-24b3203a1051\") " Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.562206 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06702e3-f9e7-4e79-99d7-24b3203a1051-kube-api-access-2hx98" (OuterVolumeSpecName: "kube-api-access-2hx98") pod "f06702e3-f9e7-4e79-99d7-24b3203a1051" (UID: "f06702e3-f9e7-4e79-99d7-24b3203a1051"). InnerVolumeSpecName "kube-api-access-2hx98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.568446 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f06702e3-f9e7-4e79-99d7-24b3203a1051" (UID: "f06702e3-f9e7-4e79-99d7-24b3203a1051"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.589213 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f06702e3-f9e7-4e79-99d7-24b3203a1051" (UID: "f06702e3-f9e7-4e79-99d7-24b3203a1051"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.610665 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-config-data" (OuterVolumeSpecName: "config-data") pod "f06702e3-f9e7-4e79-99d7-24b3203a1051" (UID: "f06702e3-f9e7-4e79-99d7-24b3203a1051"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.658531 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.658568 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hx98\" (UniqueName: \"kubernetes.io/projected/f06702e3-f9e7-4e79-99d7-24b3203a1051-kube-api-access-2hx98\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.658579 4913 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.658588 4913 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f06702e3-f9e7-4e79-99d7-24b3203a1051-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.806308 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:01:05 crc kubenswrapper[4913]: E1001 14:01:05.806691 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.964152 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-gg2t8" event={"ID":"f06702e3-f9e7-4e79-99d7-24b3203a1051","Type":"ContainerDied","Data":"43fbfd0d2fa8432a128b1b6c6cbd148720178c0c3dd492c9162fc92b42ededa8"} Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.964191 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fbfd0d2fa8432a128b1b6c6cbd148720178c0c3dd492c9162fc92b42ededa8" Oct 01 14:01:05 crc kubenswrapper[4913]: I1001 14:01:05.964370 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322121-gg2t8" Oct 01 14:01:19 crc kubenswrapper[4913]: I1001 14:01:19.808281 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:01:19 crc kubenswrapper[4913]: E1001 14:01:19.811079 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:01:32 crc kubenswrapper[4913]: I1001 14:01:32.807657 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:01:32 crc kubenswrapper[4913]: E1001 14:01:32.808769 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:01:44 crc kubenswrapper[4913]: I1001 14:01:44.809652 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:01:44 crc kubenswrapper[4913]: E1001 14:01:44.810387 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:01:58 crc kubenswrapper[4913]: I1001 14:01:58.813185 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:01:58 crc kubenswrapper[4913]: E1001 14:01:58.814005 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:02:13 crc kubenswrapper[4913]: I1001 14:02:13.807095 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:02:13 crc kubenswrapper[4913]: E1001 14:02:13.808121 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:02:25 crc kubenswrapper[4913]: I1001 14:02:25.807654 4913 scope.go:117] "RemoveContainer" 
containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:02:25 crc kubenswrapper[4913]: E1001 14:02:25.809572 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:02:38 crc kubenswrapper[4913]: I1001 14:02:38.806786 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:02:38 crc kubenswrapper[4913]: E1001 14:02:38.807861 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:02:51 crc kubenswrapper[4913]: I1001 14:02:51.806596 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:02:51 crc kubenswrapper[4913]: E1001 14:02:51.808592 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:03:04 crc kubenswrapper[4913]: I1001 14:03:04.806481 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:03:04 crc kubenswrapper[4913]: E1001 14:03:04.807348 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:03:19 crc kubenswrapper[4913]: I1001 14:03:19.807347 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:03:19 crc kubenswrapper[4913]: E1001 14:03:19.808253 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:03:31 crc kubenswrapper[4913]: I1001 14:03:31.807940 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:03:31 crc kubenswrapper[4913]: E1001 14:03:31.808789 4913 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.398153 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bl58x"] Oct 01 14:03:39 crc kubenswrapper[4913]: E1001 14:03:39.399192 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06702e3-f9e7-4e79-99d7-24b3203a1051" containerName="keystone-cron" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.399208 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06702e3-f9e7-4e79-99d7-24b3203a1051" containerName="keystone-cron" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.399423 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06702e3-f9e7-4e79-99d7-24b3203a1051" containerName="keystone-cron" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.400881 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.422008 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bl58x"] Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.448991 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-utilities\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.449129 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghhjg\" (UniqueName: \"kubernetes.io/projected/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-kube-api-access-ghhjg\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.449164 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-catalog-content\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.551112 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-catalog-content\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.551259 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-utilities\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 
14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.551415 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghhjg\" (UniqueName: \"kubernetes.io/projected/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-kube-api-access-ghhjg\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.552432 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-catalog-content\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.552511 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-utilities\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:39 crc kubenswrapper[4913]: I1001 14:03:39.804749 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghhjg\" (UniqueName: \"kubernetes.io/projected/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-kube-api-access-ghhjg\") pod \"certified-operators-bl58x\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:40 crc kubenswrapper[4913]: I1001 14:03:40.030837 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:40 crc kubenswrapper[4913]: I1001 14:03:40.506777 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bl58x"] Oct 01 14:03:41 crc kubenswrapper[4913]: I1001 14:03:41.241513 4913 generic.go:334] "Generic (PLEG): container finished" podID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerID="c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe" exitCode=0 Oct 01 14:03:41 crc kubenswrapper[4913]: I1001 14:03:41.241626 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerDied","Data":"c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe"} Oct 01 14:03:41 crc kubenswrapper[4913]: I1001 14:03:41.241812 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerStarted","Data":"7c5abc5837c50688ea4ecbca511519395fc2a903400a8edfd7497d61b14941e4"} Oct 01 14:03:42 crc kubenswrapper[4913]: I1001 14:03:42.252773 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerStarted","Data":"8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8"} Oct 01 14:03:45 crc kubenswrapper[4913]: I1001 14:03:45.281148 4913 generic.go:334] "Generic (PLEG): container finished" podID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerID="8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8" exitCode=0 Oct 01 14:03:45 crc kubenswrapper[4913]: I1001 14:03:45.281238 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerDied","Data":"8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8"} Oct 01 14:03:46 crc kubenswrapper[4913]: I1001 14:03:46.301586 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerStarted","Data":"05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7"} Oct 01 14:03:46 crc kubenswrapper[4913]: I1001 14:03:46.325957 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bl58x" podStartSLOduration=2.660119611 podStartE2EDuration="7.325933443s" podCreationTimestamp="2025-10-01 14:03:39 +0000 UTC" firstStartedPulling="2025-10-01 14:03:41.243519218 +0000 UTC m=+5153.146994796" lastFinishedPulling="2025-10-01 14:03:45.90933304 +0000 UTC m=+5157.812808628" observedRunningTime="2025-10-01 14:03:46.321313633 +0000 UTC m=+5158.224789241" watchObservedRunningTime="2025-10-01 14:03:46.325933443 +0000 UTC m=+5158.229409031" Oct 01 14:03:46 crc kubenswrapper[4913]: I1001 14:03:46.806847 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:03:46 crc kubenswrapper[4913]: E1001 14:03:46.807193 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:03:50 crc kubenswrapper[4913]: I1001 14:03:50.031494 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:50 crc kubenswrapper[4913]: I1001 14:03:50.031996 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:03:50 crc kubenswrapper[4913]: I1001 14:03:50.082378 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.077426 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.130884 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bl58x"] Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.439383 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bl58x" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="registry-server" containerID="cri-o://05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7" gracePeriod=2 Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.892521 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.975612 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghhjg\" (UniqueName: \"kubernetes.io/projected/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-kube-api-access-ghhjg\") pod \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.975972 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-catalog-content\") pod \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.976049 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-utilities\") pod \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\" (UID: \"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b\") " Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.977407 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-utilities" (OuterVolumeSpecName: "utilities") pod "affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" (UID: "affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.977685 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:04:00 crc kubenswrapper[4913]: I1001 14:04:00.998487 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-kube-api-access-ghhjg" (OuterVolumeSpecName: "kube-api-access-ghhjg") pod "affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" (UID: "affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b"). InnerVolumeSpecName "kube-api-access-ghhjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.030102 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" (UID: "affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.079354 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghhjg\" (UniqueName: \"kubernetes.io/projected/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-kube-api-access-ghhjg\") on node \"crc\" DevicePath \"\"" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.079385 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.452701 4913 generic.go:334] "Generic (PLEG): container finished" podID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerID="05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7" exitCode=0 Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.452759 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerDied","Data":"05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7"} Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.452793 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bl58x" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.452816 4913 scope.go:117] "RemoveContainer" containerID="05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.452800 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bl58x" event={"ID":"affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b","Type":"ContainerDied","Data":"7c5abc5837c50688ea4ecbca511519395fc2a903400a8edfd7497d61b14941e4"} Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.493169 4913 scope.go:117] "RemoveContainer" containerID="8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8" Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.499813 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bl58x"] Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.508800 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bl58x"] Oct 01 14:04:01 crc kubenswrapper[4913]: I1001 14:04:01.806972 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:04:01 crc kubenswrapper[4913]: E1001 14:04:01.807315 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.015097 4913 scope.go:117] "RemoveContainer" containerID="c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.065143 4913 scope.go:117] "RemoveContainer" containerID="05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7" Oct 01 14:04:02 crc kubenswrapper[4913]: E1001 14:04:02.065619 4913 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7\": container with ID starting with 05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7 not found: ID does not exist" containerID="05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.065673 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7"} err="failed to get container status \"05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7\": rpc error: code = NotFound desc = could not find container \"05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7\": container with ID starting with 05d5c6c447bcafc126a6cc2a280f098af9cb23c97bd9477f3f7c0922b8c225a7 not found: ID does not exist" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.065702 4913 scope.go:117] "RemoveContainer" containerID="8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8" Oct 01 14:04:02 crc kubenswrapper[4913]: E1001 14:04:02.066088 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8\": container with ID starting with 8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8 not found: ID does not exist" containerID="8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.066140 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8"} err="failed to get container status \"8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8\": rpc error: code = NotFound desc = could not find container \"8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8\": container with ID starting with 8448d327686339da9c7131ad7521c279e95b4396e9608f968fe126810851eaa8 not found: ID does not exist" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.066178 4913 scope.go:117] "RemoveContainer" containerID="c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe" Oct 01 14:04:02 crc kubenswrapper[4913]: E1001 14:04:02.066497 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe\": container with ID starting with c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe not found: ID does not exist" containerID="c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.066523 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe"} err="failed to get container status \"c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe\": rpc error: code = NotFound desc = could not find container \"c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe\": container with ID starting with c141073d58016738b81035e854c98f3b9d8e7d1ac008959d7feeec76e08b24fe not found: ID does not exist" Oct 01 14:04:02 crc kubenswrapper[4913]: I1001 14:04:02.817746 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" path="/var/lib/kubelet/pods/affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b/volumes" Oct 01 14:04:15 crc kubenswrapper[4913]: I1001 14:04:15.806990 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:04:15 crc kubenswrapper[4913]: E1001 14:04:15.807824 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:04:26 crc kubenswrapper[4913]: I1001 14:04:26.806584 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:04:26 crc kubenswrapper[4913]: E1001 14:04:26.807362 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:04:40 crc kubenswrapper[4913]: I1001 14:04:40.807015 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:04:40 crc kubenswrapper[4913]: E1001 14:04:40.807888 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.350352 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x5lt2"] Oct 01 14:04:46 crc kubenswrapper[4913]: E1001 14:04:46.352022 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="extract-utilities" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.352048 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="extract-utilities" Oct 01 14:04:46 crc kubenswrapper[4913]: E1001 14:04:46.352065 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="registry-server" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.352072 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="registry-server" Oct 01 14:04:46 crc kubenswrapper[4913]: E1001 14:04:46.352095 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="extract-content" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.352102 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="extract-content" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 
14:04:46.352396 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="affe2ca1-ae01-4aa6-9d1d-b7f82089ef9b" containerName="registry-server" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.355388 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.369870 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5lt2"] Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.464583 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-catalog-content\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.465113 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-utilities\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.465164 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmsd\" (UniqueName: \"kubernetes.io/projected/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-kube-api-access-mgmsd\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.567906 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-utilities\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.568226 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmsd\" (UniqueName: \"kubernetes.io/projected/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-kube-api-access-mgmsd\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.568421 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-utilities\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.568538 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-catalog-content\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.568822 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-catalog-content\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.593671 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmsd\" (UniqueName: \"kubernetes.io/projected/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-kube-api-access-mgmsd\") pod \"redhat-marketplace-x5lt2\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:46 crc kubenswrapper[4913]: I1001 14:04:46.685121 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:47 crc kubenswrapper[4913]: I1001 14:04:47.129975 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5lt2"] Oct 01 14:04:47 crc kubenswrapper[4913]: I1001 14:04:47.860026 4913 generic.go:334] "Generic (PLEG): container finished" podID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerID="e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743" exitCode=0 Oct 01 14:04:47 crc kubenswrapper[4913]: I1001 14:04:47.860084 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerDied","Data":"e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743"} Oct 01 14:04:47 crc kubenswrapper[4913]: I1001 14:04:47.860395 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerStarted","Data":"8c668b4a7805a390e8ed4849c30cdb75d22dc8537b1cac615d3096f47cf8fe6b"} Oct 01 14:04:48 crc kubenswrapper[4913]: I1001 14:04:48.872584 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerStarted","Data":"48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde"} Oct 01 14:04:49 crc kubenswrapper[4913]: I1001 14:04:49.883739 4913 generic.go:334] "Generic (PLEG): container finished" podID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerID="48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde" exitCode=0 Oct 01 14:04:49 crc kubenswrapper[4913]: I1001 14:04:49.883851 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerDied","Data":"48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde"} Oct 01 14:04:50 crc kubenswrapper[4913]: I1001 14:04:50.894075 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerStarted","Data":"f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262"} Oct 01 14:04:50 crc kubenswrapper[4913]: I1001 14:04:50.917011 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x5lt2" podStartSLOduration=2.222442025 podStartE2EDuration="4.916988857s" podCreationTimestamp="2025-10-01 14:04:46 +0000 UTC" firstStartedPulling="2025-10-01 14:04:47.862066609 +0000 UTC m=+5219.765542187" lastFinishedPulling="2025-10-01 14:04:50.556613441 +0000 UTC 
m=+5222.460089019" observedRunningTime="2025-10-01 14:04:50.909937969 +0000 UTC m=+5222.813413567" watchObservedRunningTime="2025-10-01 14:04:50.916988857 +0000 UTC m=+5222.820464435" Oct 01 14:04:52 crc kubenswrapper[4913]: I1001 14:04:52.807093 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:04:52 crc kubenswrapper[4913]: E1001 14:04:52.807660 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:04:56 crc kubenswrapper[4913]: I1001 14:04:56.685811 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:56 crc kubenswrapper[4913]: I1001 14:04:56.686405 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:56 crc kubenswrapper[4913]: I1001 14:04:56.733927 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:57 crc kubenswrapper[4913]: I1001 14:04:57.011038 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:57 crc kubenswrapper[4913]: I1001 14:04:57.064480 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5lt2"] Oct 01 14:04:58 crc kubenswrapper[4913]: I1001 14:04:58.980703 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x5lt2" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="registry-server" containerID="cri-o://f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262" gracePeriod=2 Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.545700 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.623406 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-catalog-content\") pod \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.623564 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-utilities\") pod \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.623667 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmsd\" (UniqueName: \"kubernetes.io/projected/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-kube-api-access-mgmsd\") pod \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\" (UID: \"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb\") " Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.624614 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-utilities" (OuterVolumeSpecName: "utilities") pod "0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" (UID: "0fd3f3c6-9cf6-4139-b767-dc628e51fbeb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.629302 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-kube-api-access-mgmsd" (OuterVolumeSpecName: "kube-api-access-mgmsd") pod "0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" (UID: "0fd3f3c6-9cf6-4139-b767-dc628e51fbeb"). InnerVolumeSpecName "kube-api-access-mgmsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.725650 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.725688 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgmsd\" (UniqueName: \"kubernetes.io/projected/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-kube-api-access-mgmsd\") on node \"crc\" DevicePath \"\"" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.894472 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" (UID: "0fd3f3c6-9cf6-4139-b767-dc628e51fbeb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.929032 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.995742 4913 generic.go:334] "Generic (PLEG): container finished" podID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerID="f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262" exitCode=0 Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.995795 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerDied","Data":"f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262"} Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.995839 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5lt2" event={"ID":"0fd3f3c6-9cf6-4139-b767-dc628e51fbeb","Type":"ContainerDied","Data":"8c668b4a7805a390e8ed4849c30cdb75d22dc8537b1cac615d3096f47cf8fe6b"} Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.995866 4913 scope.go:117] "RemoveContainer" containerID="f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262" Oct 01 14:04:59 crc kubenswrapper[4913]: I1001 14:04:59.995898 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5lt2" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.014981 4913 scope.go:117] "RemoveContainer" containerID="48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.031888 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5lt2"] Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.042240 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5lt2"] Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.055528 4913 scope.go:117] "RemoveContainer" containerID="e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.091889 4913 scope.go:117] "RemoveContainer" containerID="f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262" Oct 01 14:05:00 crc kubenswrapper[4913]: E1001 14:05:00.093054 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262\": container with ID starting with f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262 not found: ID does not exist" containerID="f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.093132 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262"} err="failed to get container status \"f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262\": rpc error: code = NotFound desc = could not find container \"f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262\": container with ID starting with f62232a6f6b19a7d5e59f720a9875e109e17446087ddccd1f306c54168c3a262 not found: ID does not exist" Oct 01 14:05:00 
crc kubenswrapper[4913]: I1001 14:05:00.093177 4913 scope.go:117] "RemoveContainer" containerID="48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde" Oct 01 14:05:00 crc kubenswrapper[4913]: E1001 14:05:00.093704 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde\": container with ID starting with 48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde not found: ID does not exist" containerID="48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.093749 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde"} err="failed to get container status \"48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde\": rpc error: code = NotFound desc = could not find container \"48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde\": container with ID starting with 48b457dccef4b563d18cea94db48e1425a34d2a60675a4034a09b65916db3cde not found: ID does not exist" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.093780 4913 scope.go:117] "RemoveContainer" containerID="e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743" Oct 01 14:05:00 crc kubenswrapper[4913]: E1001 14:05:00.094400 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743\": container with ID starting with e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743 not found: ID does not exist" containerID="e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.094454 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743"} err="failed to get container status \"e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743\": rpc error: code = NotFound desc = could not find container \"e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743\": container with ID starting with e680314212828387c1df1a4f7b10e66db43a8412585fd88ae7d1101e9cf46743 not found: ID does not exist" Oct 01 14:05:00 crc kubenswrapper[4913]: I1001 14:05:00.820035 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" path="/var/lib/kubelet/pods/0fd3f3c6-9cf6-4139-b767-dc628e51fbeb/volumes" Oct 01 14:05:04 crc kubenswrapper[4913]: I1001 14:05:04.808545 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:05:04 crc kubenswrapper[4913]: E1001 14:05:04.809703 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:05:18 crc kubenswrapper[4913]: I1001 14:05:18.815184 4913 scope.go:117] "RemoveContainer" 
containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:05:19 crc kubenswrapper[4913]: I1001 14:05:19.154097 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"2a168adc900961b11e5e23917c601c486864f29c09c1f4d1e5b3ce73a676338c"} Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.533952 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vk569"] Oct 01 14:05:28 crc kubenswrapper[4913]: E1001 14:05:28.536358 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="extract-utilities" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.536385 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="extract-utilities" Oct 01 14:05:28 crc kubenswrapper[4913]: E1001 14:05:28.536412 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="extract-content" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.536420 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="extract-content" Oct 01 14:05:28 crc kubenswrapper[4913]: E1001 14:05:28.536457 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="registry-server" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.536464 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="registry-server" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.536655 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd3f3c6-9cf6-4139-b767-dc628e51fbeb" containerName="registry-server" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.539692 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.546971 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vk569"] Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.622957 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-catalog-content\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.623036 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdp8m\" (UniqueName: \"kubernetes.io/projected/b1422e82-a424-4667-86a9-4a427f7e188f-kube-api-access-cdp8m\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.623067 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-utilities\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.724474 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-catalog-content\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.724543 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdp8m\" (UniqueName: \"kubernetes.io/projected/b1422e82-a424-4667-86a9-4a427f7e188f-kube-api-access-cdp8m\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.724564 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-utilities\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.725014 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-catalog-content\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.725091 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-utilities\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.746295 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cdp8m\" (UniqueName: \"kubernetes.io/projected/b1422e82-a424-4667-86a9-4a427f7e188f-kube-api-access-cdp8m\") pod \"community-operators-vk569\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:28 crc kubenswrapper[4913]: I1001 14:05:28.859241 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:29 crc kubenswrapper[4913]: I1001 14:05:29.378846 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vk569"] Oct 01 14:05:30 crc kubenswrapper[4913]: I1001 14:05:30.250312 4913 generic.go:334] "Generic (PLEG): container finished" podID="b1422e82-a424-4667-86a9-4a427f7e188f" containerID="21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947" exitCode=0 Oct 01 14:05:30 crc kubenswrapper[4913]: I1001 14:05:30.250582 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerDied","Data":"21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947"} Oct 01 14:05:30 crc kubenswrapper[4913]: I1001 14:05:30.250609 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerStarted","Data":"1211d2c7911fc291720b30765571d42998a0a895c11575bcfe78cd5777313a9d"} Oct 01 14:05:30 crc kubenswrapper[4913]: I1001 14:05:30.256009 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:05:32 crc kubenswrapper[4913]: I1001 14:05:32.269313 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerStarted","Data":"c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106"} Oct 01 14:05:33 crc kubenswrapper[4913]: I1001 14:05:33.281568 4913 generic.go:334] "Generic (PLEG): container finished" podID="b1422e82-a424-4667-86a9-4a427f7e188f" containerID="c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106" exitCode=0 Oct 01 14:05:33 crc kubenswrapper[4913]: I1001 14:05:33.281685 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerDied","Data":"c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106"} Oct 01 14:05:35 crc kubenswrapper[4913]: I1001 14:05:35.327054 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerStarted","Data":"0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42"} Oct 01 14:05:35 crc kubenswrapper[4913]: I1001 14:05:35.350399 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vk569" podStartSLOduration=3.505639154 podStartE2EDuration="7.350382312s" podCreationTimestamp="2025-10-01 14:05:28 +0000 UTC" firstStartedPulling="2025-10-01 14:05:30.255699233 +0000 UTC m=+5262.159174811" lastFinishedPulling="2025-10-01 14:05:34.100442391 +0000 UTC m=+5266.003917969" observedRunningTime="2025-10-01 14:05:35.345917657 +0000 UTC m=+5267.249393275" watchObservedRunningTime="2025-10-01 
14:05:35.350382312 +0000 UTC m=+5267.253857890" Oct 01 14:05:38 crc kubenswrapper[4913]: I1001 14:05:38.859731 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:38 crc kubenswrapper[4913]: I1001 14:05:38.860287 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:38 crc kubenswrapper[4913]: I1001 14:05:38.903838 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:39 crc kubenswrapper[4913]: I1001 14:05:39.412853 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:39 crc kubenswrapper[4913]: I1001 14:05:39.462455 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vk569"] Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.385877 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vk569" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="registry-server" containerID="cri-o://0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42" gracePeriod=2 Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.860213 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.993705 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdp8m\" (UniqueName: \"kubernetes.io/projected/b1422e82-a424-4667-86a9-4a427f7e188f-kube-api-access-cdp8m\") pod \"b1422e82-a424-4667-86a9-4a427f7e188f\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.993850 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-utilities\") pod \"b1422e82-a424-4667-86a9-4a427f7e188f\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.994011 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-catalog-content\") pod \"b1422e82-a424-4667-86a9-4a427f7e188f\" (UID: \"b1422e82-a424-4667-86a9-4a427f7e188f\") " Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.994661 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-utilities" (OuterVolumeSpecName: "utilities") pod "b1422e82-a424-4667-86a9-4a427f7e188f" (UID: "b1422e82-a424-4667-86a9-4a427f7e188f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:05:41 crc kubenswrapper[4913]: I1001 14:05:41.994882 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.002375 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1422e82-a424-4667-86a9-4a427f7e188f-kube-api-access-cdp8m" (OuterVolumeSpecName: "kube-api-access-cdp8m") pod "b1422e82-a424-4667-86a9-4a427f7e188f" (UID: "b1422e82-a424-4667-86a9-4a427f7e188f"). InnerVolumeSpecName "kube-api-access-cdp8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.096681 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdp8m\" (UniqueName: \"kubernetes.io/projected/b1422e82-a424-4667-86a9-4a427f7e188f-kube-api-access-cdp8m\") on node \"crc\" DevicePath \"\"" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.398615 4913 generic.go:334] "Generic (PLEG): container finished" podID="b1422e82-a424-4667-86a9-4a427f7e188f" containerID="0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42" exitCode=0 Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.398660 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk569" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.398677 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerDied","Data":"0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42"} Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.399111 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk569" event={"ID":"b1422e82-a424-4667-86a9-4a427f7e188f","Type":"ContainerDied","Data":"1211d2c7911fc291720b30765571d42998a0a895c11575bcfe78cd5777313a9d"} Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.399150 4913 scope.go:117] "RemoveContainer" containerID="0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.427717 4913 scope.go:117] "RemoveContainer" containerID="c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.466954 4913 scope.go:117] "RemoveContainer" containerID="21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.511010 4913 scope.go:117] "RemoveContainer" containerID="0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42" Oct 01 14:05:42 crc kubenswrapper[4913]: E1001 14:05:42.514410 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42\": container with ID starting with 0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42 not found: ID does not exist" containerID="0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.514524 4913 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42"} err="failed to get container status \"0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42\": rpc error: code = NotFound desc = could not find container \"0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42\": container with ID starting with 0030e94b89468d8a5e83ac956aec034c4ad80d9bca314fa308cc3ddfea32fc42 not found: ID does not exist" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.514676 4913 scope.go:117] "RemoveContainer" containerID="c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106" Oct 01 14:05:42 crc kubenswrapper[4913]: E1001 14:05:42.515785 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106\": container with ID starting with c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106 not found: ID does not exist" containerID="c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.515830 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106"} err="failed to get container status \"c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106\": rpc error: code = NotFound desc = could not find container \"c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106\": container with ID starting with c3123373f5001e924a86efe7fc50dc61de353f636a73f024057b89337a1a5106 not found: ID does not exist" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.515869 4913 scope.go:117] "RemoveContainer" containerID="21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947" Oct 01 14:05:42 crc kubenswrapper[4913]: E1001 14:05:42.516213 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947\": container with ID starting with 21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947 not found: ID does not exist" containerID="21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.516250 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947"} err="failed to get container status \"21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947\": rpc error: code = NotFound desc = could not find container \"21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947\": container with ID starting with 21849763b3418eed83c386b74d6d5177f1069c9af01ac71df9ff801932112947 not found: ID does not exist" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.873135 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1422e82-a424-4667-86a9-4a427f7e188f" (UID: "b1422e82-a424-4667-86a9-4a427f7e188f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:05:42 crc kubenswrapper[4913]: I1001 14:05:42.918122 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1422e82-a424-4667-86a9-4a427f7e188f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:05:43 crc kubenswrapper[4913]: I1001 14:05:43.072439 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vk569"] Oct 01 14:05:43 crc kubenswrapper[4913]: I1001 14:05:43.080836 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vk569"] Oct 01 14:05:44 crc kubenswrapper[4913]: I1001 14:05:44.826706 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" path="/var/lib/kubelet/pods/b1422e82-a424-4667-86a9-4a427f7e188f/volumes" Oct 01 14:07:40 crc kubenswrapper[4913]: I1001 14:07:40.083927 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:07:40 crc kubenswrapper[4913]: I1001 14:07:40.084544 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:08:10 crc kubenswrapper[4913]: I1001 14:08:10.083493 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:08:10 crc kubenswrapper[4913]: I1001 14:08:10.083958 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:08:19 crc kubenswrapper[4913]: I1001 14:08:19.753213 4913 generic.go:334] "Generic (PLEG): container finished" podID="3ef07e68-9334-463a-888f-1fd9fe3d3f1c" containerID="79cbc0d54a3cf7042da99ffd6f6bffa0f768618707a168586c28b9d10ef20a92" exitCode=1 Oct 01 14:08:19 crc kubenswrapper[4913]: I1001 14:08:19.753316 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"3ef07e68-9334-463a-888f-1fd9fe3d3f1c","Type":"ContainerDied","Data":"79cbc0d54a3cf7042da99ffd6f6bffa0f768618707a168586c28b9d10ef20a92"} Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.196078 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251401 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config-secret\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251623 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-workdir\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251708 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251799 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ssh-key\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251914 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ceph\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251969 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-config-data\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.251991 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44wb\" (UniqueName: \"kubernetes.io/projected/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-kube-api-access-p44wb\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.252038 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.252064 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-temporary\") pod \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.252305 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ca-certs\") pod 
\"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\" (UID: \"3ef07e68-9334-463a-888f-1fd9fe3d3f1c\") " Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.252710 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-config-data" (OuterVolumeSpecName: "config-data") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.252986 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.253121 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.253144 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.265229 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267501 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Oct 01 14:08:21 crc kubenswrapper[4913]: E1001 14:08:21.267862 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="registry-server" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267878 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="registry-server" Oct 01 14:08:21 crc kubenswrapper[4913]: E1001 14:08:21.267898 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="extract-utilities" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267906 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="extract-utilities" Oct 01 14:08:21 crc kubenswrapper[4913]: E1001 14:08:21.267916 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef07e68-9334-463a-888f-1fd9fe3d3f1c" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267922 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef07e68-9334-463a-888f-1fd9fe3d3f1c" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:08:21 crc kubenswrapper[4913]: E1001 14:08:21.267948 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="extract-content" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267954 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="extract-content" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267879 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ceph" (OuterVolumeSpecName: "ceph") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.267888 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-kube-api-access-p44wb" (OuterVolumeSpecName: "kube-api-access-p44wb") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "kube-api-access-p44wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.268118 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef07e68-9334-463a-888f-1fd9fe3d3f1c" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.268142 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1422e82-a424-4667-86a9-4a427f7e188f" containerName="registry-server" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.268775 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.268849 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.271259 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.271508 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.287995 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.299458 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.307447 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.320463 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.322486 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "3ef07e68-9334-463a-888f-1fd9fe3d3f1c" (UID: "3ef07e68-9334-463a-888f-1fd9fe3d3f1c"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355132 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355257 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355619 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355729 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355774 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355841 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355899 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.355934 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356036 4913 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrtd\" (UniqueName: \"kubernetes.io/projected/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-kube-api-access-4vrtd\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356094 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356187 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356203 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356215 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356224 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356236 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44wb\" (UniqueName: \"kubernetes.io/projected/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-kube-api-access-p44wb\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356245 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.356254 4913 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3ef07e68-9334-463a-888f-1fd9fe3d3f1c-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.388989 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458372 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: 
I1001 14:08:21.458483 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458553 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458586 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458641 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458752 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrtd\" (UniqueName: \"kubernetes.io/projected/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-kube-api-access-4vrtd\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458819 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458885 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458907 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.458939 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" 
(UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.460176 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.460354 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.460952 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.463146 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.463777 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.464964 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.465919 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.477599 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrtd\" (UniqueName: \"kubernetes.io/projected/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-kube-api-access-4vrtd\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.730078 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.773334 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"3ef07e68-9334-463a-888f-1fd9fe3d3f1c","Type":"ContainerDied","Data":"ca5bdeb82302ffd823ed9c23f3e647110192b97265fa7546d7e515d8b69e86b6"} Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.773647 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca5bdeb82302ffd823ed9c23f3e647110192b97265fa7546d7e515d8b69e86b6" Oct 01 14:08:21 crc kubenswrapper[4913]: I1001 14:08:21.773393 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Oct 01 14:08:22 crc kubenswrapper[4913]: I1001 14:08:22.262411 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Oct 01 14:08:22 crc kubenswrapper[4913]: I1001 14:08:22.787791 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"3a042f5f-1b7b-42fa-b8b4-db936848fbe9","Type":"ContainerStarted","Data":"bc9dc081f738cf205182211f6e597943fc5f9b8775525bdd47a66d4b9aae3abd"} Oct 01 14:08:23 crc kubenswrapper[4913]: I1001 14:08:23.799419 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"3a042f5f-1b7b-42fa-b8b4-db936848fbe9","Type":"ContainerStarted","Data":"e6744d81a2cd43ca4605770f472a69fc9e15c914e0249850213c07ff3b46662e"} Oct 01 14:08:23 crc kubenswrapper[4913]: I1001 14:08:23.819512 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-test" podStartSLOduration=2.819492012 podStartE2EDuration="2.819492012s" podCreationTimestamp="2025-10-01 14:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:08:23.817884447 +0000 UTC m=+5435.721360105" watchObservedRunningTime="2025-10-01 14:08:23.819492012 +0000 UTC m=+5435.722967600" Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.084978 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.085901 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.085976 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.087178 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a168adc900961b11e5e23917c601c486864f29c09c1f4d1e5b3ce73a676338c"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.087293 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://2a168adc900961b11e5e23917c601c486864f29c09c1f4d1e5b3ce73a676338c" gracePeriod=600 Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.954772 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="2a168adc900961b11e5e23917c601c486864f29c09c1f4d1e5b3ce73a676338c" exitCode=0 Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.954848 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"2a168adc900961b11e5e23917c601c486864f29c09c1f4d1e5b3ce73a676338c"} Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.955518 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb"} Oct 01 14:08:40 crc kubenswrapper[4913]: I1001 14:08:40.955543 4913 scope.go:117] "RemoveContainer" containerID="bd9cb3e08056eaa13dbd68e32a1a60fd5982e0d184fde9f6ff6ba69d9f4f4379" Oct 01 14:10:40 crc kubenswrapper[4913]: I1001 14:10:40.083945 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:10:40 crc kubenswrapper[4913]: I1001 14:10:40.084461 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.244414 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2jv2r"] Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.249597 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.260013 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2jv2r"] Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.305466 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-utilities\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.305514 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjvj\" (UniqueName: \"kubernetes.io/projected/3435690d-dbe1-4e58-a81c-d2c158705df3-kube-api-access-4kjvj\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.305715 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-catalog-content\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.407387 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-utilities\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.407445 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kjvj\" (UniqueName: \"kubernetes.io/projected/3435690d-dbe1-4e58-a81c-d2c158705df3-kube-api-access-4kjvj\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.407639 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-catalog-content\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.407883 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-utilities\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.408154 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-catalog-content\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.436859 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4kjvj\" (UniqueName: \"kubernetes.io/projected/3435690d-dbe1-4e58-a81c-d2c158705df3-kube-api-access-4kjvj\") pod \"redhat-operators-2jv2r\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:05 crc kubenswrapper[4913]: I1001 14:11:05.589716 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:06 crc kubenswrapper[4913]: I1001 14:11:06.040881 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2jv2r"] Oct 01 14:11:06 crc kubenswrapper[4913]: I1001 14:11:06.288427 4913 generic.go:334] "Generic (PLEG): container finished" podID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerID="c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045" exitCode=0 Oct 01 14:11:06 crc kubenswrapper[4913]: I1001 14:11:06.288508 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerDied","Data":"c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045"} Oct 01 14:11:06 crc kubenswrapper[4913]: I1001 14:11:06.288742 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerStarted","Data":"7ad7dd5971d686e230c48dbd9a246ffc51cfa5e084ba127355885b1c85d4acb7"} Oct 01 14:11:06 crc kubenswrapper[4913]: I1001 14:11:06.292379 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:11:10 crc kubenswrapper[4913]: I1001 14:11:10.084029 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:11:10 crc kubenswrapper[4913]: I1001 14:11:10.084742 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:11:11 crc kubenswrapper[4913]: I1001 14:11:11.339703 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerStarted","Data":"91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1"} Oct 01 14:11:12 crc kubenswrapper[4913]: I1001 14:11:12.355339 4913 generic.go:334] "Generic (PLEG): container finished" podID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerID="91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1" exitCode=0 Oct 01 14:11:12 crc kubenswrapper[4913]: I1001 14:11:12.355639 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerDied","Data":"91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1"} Oct 01 14:11:13 crc kubenswrapper[4913]: I1001 14:11:13.369772 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" 
event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerStarted","Data":"5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6"} Oct 01 14:11:13 crc kubenswrapper[4913]: I1001 14:11:13.389573 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2jv2r" podStartSLOduration=1.835478358 podStartE2EDuration="8.389556352s" podCreationTimestamp="2025-10-01 14:11:05 +0000 UTC" firstStartedPulling="2025-10-01 14:11:06.292076745 +0000 UTC m=+5598.195552323" lastFinishedPulling="2025-10-01 14:11:12.846154739 +0000 UTC m=+5604.749630317" observedRunningTime="2025-10-01 14:11:13.385915274 +0000 UTC m=+5605.289390882" watchObservedRunningTime="2025-10-01 14:11:13.389556352 +0000 UTC m=+5605.293031920" Oct 01 14:11:15 crc kubenswrapper[4913]: I1001 14:11:15.590299 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:15 crc kubenswrapper[4913]: I1001 14:11:15.590904 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:16 crc kubenswrapper[4913]: I1001 14:11:16.635914 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2jv2r" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="registry-server" probeResult="failure" output=< Oct 01 14:11:16 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s Oct 01 14:11:16 crc kubenswrapper[4913]: > Oct 01 14:11:25 crc kubenswrapper[4913]: I1001 14:11:25.639662 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:25 crc kubenswrapper[4913]: I1001 14:11:25.686431 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:25 crc kubenswrapper[4913]: I1001 14:11:25.871627 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2jv2r"] Oct 01 14:11:27 crc kubenswrapper[4913]: I1001 14:11:27.500060 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2jv2r" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="registry-server" containerID="cri-o://5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6" gracePeriod=2 Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.487951 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.510514 4913 generic.go:334] "Generic (PLEG): container finished" podID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerID="5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6" exitCode=0 Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.510557 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerDied","Data":"5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6"} Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.510581 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2jv2r" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.510632 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jv2r" event={"ID":"3435690d-dbe1-4e58-a81c-d2c158705df3","Type":"ContainerDied","Data":"7ad7dd5971d686e230c48dbd9a246ffc51cfa5e084ba127355885b1c85d4acb7"} Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.510659 4913 scope.go:117] "RemoveContainer" containerID="5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.554550 4913 scope.go:117] "RemoveContainer" containerID="91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.575062 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-utilities\") pod \"3435690d-dbe1-4e58-a81c-d2c158705df3\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.575311 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kjvj\" (UniqueName: \"kubernetes.io/projected/3435690d-dbe1-4e58-a81c-d2c158705df3-kube-api-access-4kjvj\") pod \"3435690d-dbe1-4e58-a81c-d2c158705df3\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.575456 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-catalog-content\") pod \"3435690d-dbe1-4e58-a81c-d2c158705df3\" (UID: \"3435690d-dbe1-4e58-a81c-d2c158705df3\") " Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.576190 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-utilities" (OuterVolumeSpecName: "utilities") pod "3435690d-dbe1-4e58-a81c-d2c158705df3" (UID: "3435690d-dbe1-4e58-a81c-d2c158705df3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.580397 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3435690d-dbe1-4e58-a81c-d2c158705df3-kube-api-access-4kjvj" (OuterVolumeSpecName: "kube-api-access-4kjvj") pod "3435690d-dbe1-4e58-a81c-d2c158705df3" (UID: "3435690d-dbe1-4e58-a81c-d2c158705df3"). InnerVolumeSpecName "kube-api-access-4kjvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.594767 4913 scope.go:117] "RemoveContainer" containerID="c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.650452 4913 scope.go:117] "RemoveContainer" containerID="5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6" Oct 01 14:11:28 crc kubenswrapper[4913]: E1001 14:11:28.650975 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6\": container with ID starting with 5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6 not found: ID does not exist" containerID="5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.651046 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6"} err="failed to get container status \"5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6\": rpc error: code = NotFound desc = could not find container \"5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6\": container with ID starting with 5e0ea75f14a4c923f0a2f9acb197c79eadddad1250343061c1c54051b8ad8cf6 not found: ID does not exist" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.651081 4913 scope.go:117] "RemoveContainer" containerID="91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1" Oct 01 14:11:28 crc kubenswrapper[4913]: E1001 14:11:28.651625 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1\": container with ID starting with 91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1 not found: ID does not exist" containerID="91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.651659 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1"} err="failed to get container status \"91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1\": rpc error: code = NotFound desc = could not find container \"91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1\": container with ID starting with 91f477e8ce948a600f372a526cb1e44fe7044a1513aea10e8988a190c09575b1 not found: ID does not exist" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.651679 4913 scope.go:117] "RemoveContainer" containerID="c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045" Oct 01 14:11:28 crc kubenswrapper[4913]: E1001 14:11:28.652098 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045\": container with ID starting with c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045 not found: ID does not exist" containerID="c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.652150 4913 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045"} err="failed to get container status \"c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045\": rpc error: code = NotFound desc = could not find container \"c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045\": container with ID starting with c6b9acbc03741f5d53d5cc09e0994531856a69fcb847f58ebdaf5ea562bbd045 not found: ID does not exist" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.662320 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3435690d-dbe1-4e58-a81c-d2c158705df3" (UID: "3435690d-dbe1-4e58-a81c-d2c158705df3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.677990 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.678061 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3435690d-dbe1-4e58-a81c-d2c158705df3-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.678075 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kjvj\" (UniqueName: \"kubernetes.io/projected/3435690d-dbe1-4e58-a81c-d2c158705df3-kube-api-access-4kjvj\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.845834 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2jv2r"] Oct 01 14:11:28 crc kubenswrapper[4913]: I1001 14:11:28.853889 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2jv2r"] Oct 01 14:11:30 crc kubenswrapper[4913]: I1001 14:11:30.823730 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" path="/var/lib/kubelet/pods/3435690d-dbe1-4e58-a81c-d2c158705df3/volumes" Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.083483 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.086327 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.086539 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.087933 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb"} 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.088371 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" gracePeriod=600 Oct 01 14:11:40 crc kubenswrapper[4913]: E1001 14:11:40.210143 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.607338 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" exitCode=0 Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.607518 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb"} Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.608908 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:11:40 crc kubenswrapper[4913]: I1001 14:11:40.608971 4913 scope.go:117] "RemoveContainer" containerID="2a168adc900961b11e5e23917c601c486864f29c09c1f4d1e5b3ce73a676338c" Oct 01 14:11:40 crc kubenswrapper[4913]: E1001 14:11:40.609619 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:11:52 crc kubenswrapper[4913]: I1001 14:11:52.806636 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:11:52 crc kubenswrapper[4913]: E1001 14:11:52.807492 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:12:05 crc kubenswrapper[4913]: I1001 14:12:05.806987 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:12:05 crc kubenswrapper[4913]: E1001 14:12:05.807901 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:12:16 crc kubenswrapper[4913]: I1001 14:12:16.806916 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:12:16 crc kubenswrapper[4913]: E1001 14:12:16.807922 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:12:28 crc kubenswrapper[4913]: I1001 14:12:28.814766 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:12:28 crc kubenswrapper[4913]: E1001 14:12:28.815405 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:12:40 crc kubenswrapper[4913]: I1001 14:12:40.806546 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:12:40 crc kubenswrapper[4913]: E1001 14:12:40.807430 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:12:48 crc kubenswrapper[4913]: I1001 14:12:48.246777 4913 generic.go:334] "Generic (PLEG): container finished" podID="3a042f5f-1b7b-42fa-b8b4-db936848fbe9" containerID="e6744d81a2cd43ca4605770f472a69fc9e15c914e0249850213c07ff3b46662e" exitCode=0 Oct 01 14:12:48 crc kubenswrapper[4913]: I1001 14:12:48.246922 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"3a042f5f-1b7b-42fa-b8b4-db936848fbe9","Type":"ContainerDied","Data":"e6744d81a2cd43ca4605770f472a69fc9e15c914e0249850213c07ff3b46662e"} Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.728478 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898636 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898767 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ca-certs\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898801 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vrtd\" (UniqueName: \"kubernetes.io/projected/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-kube-api-access-4vrtd\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898824 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ssh-key\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898855 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-temporary\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898879 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898912 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-config-data\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.898977 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ceph\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.899032 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-workdir\") pod \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.899056 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config-secret\") pod 
\"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\" (UID: \"3a042f5f-1b7b-42fa-b8b4-db936848fbe9\") " Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.899783 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.900006 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-config-data" (OuterVolumeSpecName: "config-data") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.900583 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.900621 4913 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.906473 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.915621 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ceph" (OuterVolumeSpecName: "ceph") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.916701 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-kube-api-access-4vrtd" (OuterVolumeSpecName: "kube-api-access-4vrtd") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "kube-api-access-4vrtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.916915 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.928170 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.932931 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.941923 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:12:49 crc kubenswrapper[4913]: I1001 14:12:49.961145 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3a042f5f-1b7b-42fa-b8b4-db936848fbe9" (UID: "3a042f5f-1b7b-42fa-b8b4-db936848fbe9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002516 4913 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002569 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vrtd\" (UniqueName: \"kubernetes.io/projected/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-kube-api-access-4vrtd\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002587 4913 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002625 4913 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002635 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002644 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002653 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.002663 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3a042f5f-1b7b-42fa-b8b4-db936848fbe9-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.022280 4913 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.104359 4913 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.266764 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"3a042f5f-1b7b-42fa-b8b4-db936848fbe9","Type":"ContainerDied","Data":"bc9dc081f738cf205182211f6e597943fc5f9b8775525bdd47a66d4b9aae3abd"} Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.267925 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc9dc081f738cf205182211f6e597943fc5f9b8775525bdd47a66d4b9aae3abd" Oct 01 14:12:50 crc kubenswrapper[4913]: I1001 14:12:50.266808 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Oct 01 14:12:54 crc kubenswrapper[4913]: I1001 14:12:54.806672 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:12:54 crc kubenswrapper[4913]: E1001 14:12:54.807386 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.632807 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:12:58 crc kubenswrapper[4913]: E1001 14:12:58.633885 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a042f5f-1b7b-42fa-b8b4-db936848fbe9" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.633898 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a042f5f-1b7b-42fa-b8b4-db936848fbe9" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:12:58 crc kubenswrapper[4913]: E1001 14:12:58.633921 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="extract-content" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.633927 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="extract-content" Oct 01 14:12:58 crc kubenswrapper[4913]: E1001 14:12:58.633978 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="extract-utilities" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 
14:12:58.633989 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="extract-utilities" Oct 01 14:12:58 crc kubenswrapper[4913]: E1001 14:12:58.634005 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="registry-server" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.634013 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="registry-server" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.634303 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3435690d-dbe1-4e58-a81c-d2c158705df3" containerName="registry-server" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.634319 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a042f5f-1b7b-42fa-b8b4-db936848fbe9" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.643097 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.645760 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qb4x7" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.647653 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.693770 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prgpf\" (UniqueName: \"kubernetes.io/projected/8a0f7457-8032-4fb8-8784-126cad8b13a8-kube-api-access-prgpf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.693829 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.796250 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prgpf\" (UniqueName: \"kubernetes.io/projected/8a0f7457-8032-4fb8-8784-126cad8b13a8-kube-api-access-prgpf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.796392 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.796955 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.817820 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prgpf\" (UniqueName: \"kubernetes.io/projected/8a0f7457-8032-4fb8-8784-126cad8b13a8-kube-api-access-prgpf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.836707 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8a0f7457-8032-4fb8-8784-126cad8b13a8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:58 crc kubenswrapper[4913]: I1001 14:12:58.966421 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:12:59 crc kubenswrapper[4913]: I1001 14:12:59.404163 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:13:00 crc kubenswrapper[4913]: I1001 14:13:00.369187 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8a0f7457-8032-4fb8-8784-126cad8b13a8","Type":"ContainerStarted","Data":"190b65cd03959468d055992a8d0173882739d8c8723d1d8fe1be853025cd4f39"} Oct 01 14:13:01 crc kubenswrapper[4913]: I1001 14:13:01.384571 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8a0f7457-8032-4fb8-8784-126cad8b13a8","Type":"ContainerStarted","Data":"98b5e707fbf3105a8853781863b700cf42040b9fff887385d6cc75710ebc169f"} Oct 01 14:13:01 crc kubenswrapper[4913]: I1001 14:13:01.400158 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.3178456020000002 podStartE2EDuration="3.400130406s" podCreationTimestamp="2025-10-01 14:12:58 +0000 UTC" firstStartedPulling="2025-10-01 14:12:59.414454095 +0000 UTC m=+5711.317929673" lastFinishedPulling="2025-10-01 14:13:00.496738899 +0000 UTC m=+5712.400214477" observedRunningTime="2025-10-01 14:13:01.39840461 +0000 UTC m=+5713.301880268" watchObservedRunningTime="2025-10-01 14:13:01.400130406 +0000 UTC m=+5713.303606024" Oct 01 14:13:05 crc kubenswrapper[4913]: I1001 14:13:05.807874 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:13:05 crc kubenswrapper[4913]: E1001 14:13:05.808418 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:13:19 crc kubenswrapper[4913]: I1001 14:13:19.807103 4913 scope.go:117] 
"RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:13:19 crc kubenswrapper[4913]: E1001 14:13:19.809098 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.076650 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.078694 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.081545 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.082052 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"tobiko-secret" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.082314 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-config" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.082390 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-private-key" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.086382 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-public-key" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.101683 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.237999 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fz2x\" (UniqueName: \"kubernetes.io/projected/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kube-api-access-7fz2x\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238068 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238176 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238214 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238255 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238544 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238630 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238694 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238767 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238815 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.238855 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.239028 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340720 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340784 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fz2x\" (UniqueName: \"kubernetes.io/projected/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kube-api-access-7fz2x\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340821 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340851 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340890 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340922 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.340985 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341008 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: 
\"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341028 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341049 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341066 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341085 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341486 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341533 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.341800 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.342149 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc 
kubenswrapper[4913]: I1001 14:13:24.342186 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.342534 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.342600 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.348906 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.349707 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.350415 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.357639 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.358709 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fz2x\" (UniqueName: \"kubernetes.io/projected/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kube-api-access-7fz2x\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.371464 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.402140 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:13:24 crc kubenswrapper[4913]: I1001 14:13:24.926877 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Oct 01 14:13:25 crc kubenswrapper[4913]: I1001 14:13:25.585692 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"567cfe78-d6e0-4a12-99d7-a280aeb55e68","Type":"ContainerStarted","Data":"dd5e0779a649969f08d197bf11a51b8fa2e5222b0b705d07d732576da3d0785d"} Oct 01 14:13:30 crc kubenswrapper[4913]: I1001 14:13:30.808083 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:13:30 crc kubenswrapper[4913]: E1001 14:13:30.809003 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:13:41 crc kubenswrapper[4913]: I1001 14:13:41.806494 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:13:41 crc kubenswrapper[4913]: E1001 14:13:41.807083 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.657147 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hwddd"] Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.660854 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.674599 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwddd"] Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.796924 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-catalog-content\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.796980 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8c59\" (UniqueName: \"kubernetes.io/projected/ab45de02-573f-432e-9105-eb56f9b08e1f-kube-api-access-n8c59\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.797008 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-utilities\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.899139 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-catalog-content\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.899200 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8c59\" (UniqueName: \"kubernetes.io/projected/ab45de02-573f-432e-9105-eb56f9b08e1f-kube-api-access-n8c59\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.899226 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-utilities\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.900497 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-catalog-content\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.901049 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-utilities\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.926554 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n8c59\" (UniqueName: \"kubernetes.io/projected/ab45de02-573f-432e-9105-eb56f9b08e1f-kube-api-access-n8c59\") pod \"certified-operators-hwddd\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:42 crc kubenswrapper[4913]: I1001 14:13:42.990749 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:43 crc kubenswrapper[4913]: I1001 14:13:43.489089 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwddd"] Oct 01 14:13:43 crc kubenswrapper[4913]: I1001 14:13:43.739297 4913 generic.go:334] "Generic (PLEG): container finished" podID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerID="e9d6296cd933a1ad779257cd9cc04fc0c45b653db662ba1ae063b3c50bab4b8c" exitCode=0 Oct 01 14:13:43 crc kubenswrapper[4913]: I1001 14:13:43.739405 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwddd" event={"ID":"ab45de02-573f-432e-9105-eb56f9b08e1f","Type":"ContainerDied","Data":"e9d6296cd933a1ad779257cd9cc04fc0c45b653db662ba1ae063b3c50bab4b8c"} Oct 01 14:13:43 crc kubenswrapper[4913]: I1001 14:13:43.739620 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwddd" event={"ID":"ab45de02-573f-432e-9105-eb56f9b08e1f","Type":"ContainerStarted","Data":"2960e8e3d50bc464053793c0f420d6e4232525ca9fe7c25afa656dd67770dc4f"} Oct 01 14:13:45 crc kubenswrapper[4913]: I1001 14:13:45.824481 4913 generic.go:334] "Generic (PLEG): container finished" podID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerID="5111877ea8543af2fc4d712930e4bf20c3c05308a70ba6c088d246a054ea4773" exitCode=0 Oct 01 14:13:45 crc kubenswrapper[4913]: I1001 14:13:45.824551 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwddd" event={"ID":"ab45de02-573f-432e-9105-eb56f9b08e1f","Type":"ContainerDied","Data":"5111877ea8543af2fc4d712930e4bf20c3c05308a70ba6c088d246a054ea4773"} Oct 01 14:13:46 crc kubenswrapper[4913]: I1001 14:13:46.836386 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwddd" event={"ID":"ab45de02-573f-432e-9105-eb56f9b08e1f","Type":"ContainerStarted","Data":"6caa2a2b41359e9b221e24c3aa727f821ccba07c636d5d45cfe51bea2cec442f"} Oct 01 14:13:46 crc kubenswrapper[4913]: I1001 14:13:46.867545 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hwddd" podStartSLOduration=2.161994506 podStartE2EDuration="4.867516773s" podCreationTimestamp="2025-10-01 14:13:42 +0000 UTC" firstStartedPulling="2025-10-01 14:13:43.741040841 +0000 UTC m=+5755.644516419" lastFinishedPulling="2025-10-01 14:13:46.446563108 +0000 UTC m=+5758.350038686" observedRunningTime="2025-10-01 14:13:46.853392111 +0000 UTC m=+5758.756867689" watchObservedRunningTime="2025-10-01 14:13:46.867516773 +0000 UTC m=+5758.770992381" Oct 01 14:13:52 crc kubenswrapper[4913]: I1001 14:13:52.990905 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:52 crc kubenswrapper[4913]: I1001 14:13:52.991535 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:53 crc kubenswrapper[4913]: I1001 14:13:53.044056 4913 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:53 crc kubenswrapper[4913]: I1001 14:13:53.952054 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:13:54 crc kubenswrapper[4913]: I1001 14:13:54.003365 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwddd"] Oct 01 14:13:54 crc kubenswrapper[4913]: I1001 14:13:54.806705 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:13:54 crc kubenswrapper[4913]: E1001 14:13:54.807214 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:13:55 crc kubenswrapper[4913]: I1001 14:13:55.920169 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hwddd" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="registry-server" containerID="cri-o://6caa2a2b41359e9b221e24c3aa727f821ccba07c636d5d45cfe51bea2cec442f" gracePeriod=2 Oct 01 14:13:56 crc kubenswrapper[4913]: I1001 14:13:56.930325 4913 generic.go:334] "Generic (PLEG): container finished" podID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerID="6caa2a2b41359e9b221e24c3aa727f821ccba07c636d5d45cfe51bea2cec442f" exitCode=0 Oct 01 14:13:56 crc kubenswrapper[4913]: I1001 14:13:56.930404 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwddd" event={"ID":"ab45de02-573f-432e-9105-eb56f9b08e1f","Type":"ContainerDied","Data":"6caa2a2b41359e9b221e24c3aa727f821ccba07c636d5d45cfe51bea2cec442f"} Oct 01 14:14:01 crc kubenswrapper[4913]: E1001 14:14:01.690797 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tobiko:current-podified" Oct 01 14:14:01 crc kubenswrapper[4913]: E1001 14:14:01.691666 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tobiko-tests-tobiko,Image:quay.io/podified-antelope-centos9/openstack-tobiko:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TOBIKO_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:TOBIKO_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:TOBIKO_LOGS_DIR_NAME,Value:tobiko-tests-tobiko-s00-podified-functional,ValueFrom:nil,},EnvVar{Name:TOBIKO_PYTEST_ADDOPTS,Value:,ValueFrom:nil,},EnvVar{Name:TOBIKO_TESTENV,Value:functional -- tobiko/tests/functional/podified/test_topology.py,ValueFrom:nil,},EnvVar{Name:TOBIKO_VERSION,Value:master,ValueFrom:nil,},EnvVar{Name:TOX_NUM_PROCESSES,Value:2,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{8 0} {} 8 DecimalSI},memory: {{8589934592 0} {} BinarySI},},Requests:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tobiko,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tobiko/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/tobiko/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tobiko-config,ReadOnly:false,MountPath:/etc/tobiko/tobiko.conf,SubPath:tobiko.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tobiko-private-key,ReadOnly:true,MountPath:/etc/test_operator/id_ecdsa,SubPath:id_ecdsa,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tobiko-public-key,ReadOnly:true,MountPath:/etc/test_operator/id_ecdsa.pub,SubPath:id_ecdsa.pub,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kubeconfig,ReadOnly:true,MountPath:/var/lib/tobiko/.kube/config,SubPath:config,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fz2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42495,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42495,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tobiko-tests-tobiko-s00-podified-functional_openstack(567cfe78-d6e0-4a12-99d7-a280aeb55e68): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 14:14:01 crc kubenswrapper[4913]: E1001 14:14:01.692965 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tobiko-tests-tobiko\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podUID="567cfe78-d6e0-4a12-99d7-a280aeb55e68" Oct 01 14:14:01 crc kubenswrapper[4913]: I1001 14:14:01.959870 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:01.990041 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwddd" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:01.990289 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwddd" event={"ID":"ab45de02-573f-432e-9105-eb56f9b08e1f","Type":"ContainerDied","Data":"2960e8e3d50bc464053793c0f420d6e4232525ca9fe7c25afa656dd67770dc4f"} Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:01.990322 4913 scope.go:117] "RemoveContainer" containerID="6caa2a2b41359e9b221e24c3aa727f821ccba07c636d5d45cfe51bea2cec442f" Oct 01 14:14:02 crc kubenswrapper[4913]: E1001 14:14:01.991014 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tobiko-tests-tobiko\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tobiko:current-podified\\\"\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podUID="567cfe78-d6e0-4a12-99d7-a280aeb55e68" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.017126 4913 scope.go:117] "RemoveContainer" containerID="5111877ea8543af2fc4d712930e4bf20c3c05308a70ba6c088d246a054ea4773" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.043357 4913 scope.go:117] "RemoveContainer" containerID="e9d6296cd933a1ad779257cd9cc04fc0c45b653db662ba1ae063b3c50bab4b8c" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.073645 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8c59\" (UniqueName: \"kubernetes.io/projected/ab45de02-573f-432e-9105-eb56f9b08e1f-kube-api-access-n8c59\") pod \"ab45de02-573f-432e-9105-eb56f9b08e1f\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.074112 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-catalog-content\") pod \"ab45de02-573f-432e-9105-eb56f9b08e1f\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.074208 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-utilities\") pod \"ab45de02-573f-432e-9105-eb56f9b08e1f\" (UID: \"ab45de02-573f-432e-9105-eb56f9b08e1f\") " Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.075931 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-utilities" (OuterVolumeSpecName: "utilities") pod "ab45de02-573f-432e-9105-eb56f9b08e1f" (UID: "ab45de02-573f-432e-9105-eb56f9b08e1f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.083180 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab45de02-573f-432e-9105-eb56f9b08e1f-kube-api-access-n8c59" (OuterVolumeSpecName: "kube-api-access-n8c59") pod "ab45de02-573f-432e-9105-eb56f9b08e1f" (UID: "ab45de02-573f-432e-9105-eb56f9b08e1f"). InnerVolumeSpecName "kube-api-access-n8c59". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.125493 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab45de02-573f-432e-9105-eb56f9b08e1f" (UID: "ab45de02-573f-432e-9105-eb56f9b08e1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.176733 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.176783 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab45de02-573f-432e-9105-eb56f9b08e1f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.176804 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8c59\" (UniqueName: \"kubernetes.io/projected/ab45de02-573f-432e-9105-eb56f9b08e1f-kube-api-access-n8c59\") on node \"crc\" DevicePath \"\"" Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.332385 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwddd"] Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.341029 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hwddd"] Oct 01 14:14:02 crc kubenswrapper[4913]: I1001 14:14:02.817862 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" path="/var/lib/kubelet/pods/ab45de02-573f-432e-9105-eb56f9b08e1f/volumes" Oct 01 14:14:06 crc kubenswrapper[4913]: I1001 14:14:06.807389 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:14:06 crc kubenswrapper[4913]: E1001 14:14:06.808079 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:14:18 crc kubenswrapper[4913]: I1001 14:14:18.151079 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"567cfe78-d6e0-4a12-99d7-a280aeb55e68","Type":"ContainerStarted","Data":"865ad118e7696cfc40b30a09f2e3801c094660948457631ae61b1c5f2b694130"} Oct 01 14:14:18 crc kubenswrapper[4913]: I1001 14:14:18.173699 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podStartSLOduration=2.858028483 podStartE2EDuration="55.173678746s" podCreationTimestamp="2025-10-01 14:13:23 +0000 UTC" firstStartedPulling="2025-10-01 14:13:24.921488668 +0000 UTC m=+5736.824964246" lastFinishedPulling="2025-10-01 14:14:17.237138931 +0000 UTC m=+5789.140614509" observedRunningTime="2025-10-01 14:14:18.173092961 +0000 UTC m=+5790.076568559" watchObservedRunningTime="2025-10-01 14:14:18.173678746 +0000 UTC m=+5790.077154324" Oct 01 14:14:20 crc kubenswrapper[4913]: I1001 14:14:20.809907 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:14:20 crc kubenswrapper[4913]: E1001 14:14:20.810881 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:14:35 crc kubenswrapper[4913]: I1001 14:14:35.807149 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:14:35 crc kubenswrapper[4913]: E1001 14:14:35.807991 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:14:46 crc kubenswrapper[4913]: I1001 14:14:46.807105 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:14:46 crc kubenswrapper[4913]: E1001 14:14:46.807960 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:14:57 crc kubenswrapper[4913]: I1001 14:14:57.806754 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:14:57 crc kubenswrapper[4913]: E1001 14:14:57.807562 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.157455 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb"] Oct 01 14:15:00 crc kubenswrapper[4913]: E1001 14:15:00.158239 4913 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="extract-utilities" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.158255 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="extract-utilities" Oct 01 14:15:00 crc kubenswrapper[4913]: E1001 14:15:00.158294 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="registry-server" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.158303 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="registry-server" Oct 01 14:15:00 crc kubenswrapper[4913]: E1001 14:15:00.158321 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="extract-content" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.158331 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="extract-content" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.158586 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab45de02-573f-432e-9105-eb56f9b08e1f" containerName="registry-server" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.163955 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.170323 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.170667 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.202292 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb"] Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.345990 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-secret-volume\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.346302 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75v6x\" (UniqueName: \"kubernetes.io/projected/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-kube-api-access-75v6x\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.346434 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-config-volume\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.448250 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-secret-volume\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.448366 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75v6x\" (UniqueName: \"kubernetes.io/projected/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-kube-api-access-75v6x\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.448468 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-config-volume\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.449531 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-config-volume\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.453750 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-secret-volume\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.476445 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75v6x\" (UniqueName: \"kubernetes.io/projected/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-kube-api-access-75v6x\") pod \"collect-profiles-29322135-gtkdb\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.485005 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:00 crc kubenswrapper[4913]: I1001 14:15:00.973468 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb"] Oct 01 14:15:01 crc kubenswrapper[4913]: I1001 14:15:01.518974 4913 generic.go:334] "Generic (PLEG): container finished" podID="4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" containerID="5bda272535c851d0aa36faac8657947406fe8d41e2b0cc1774ca93df21f0567b" exitCode=0 Oct 01 14:15:01 crc kubenswrapper[4913]: I1001 14:15:01.519254 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" event={"ID":"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a","Type":"ContainerDied","Data":"5bda272535c851d0aa36faac8657947406fe8d41e2b0cc1774ca93df21f0567b"} Oct 01 14:15:01 crc kubenswrapper[4913]: I1001 14:15:01.519297 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" event={"ID":"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a","Type":"ContainerStarted","Data":"a462c160a44ea2c949af082e1ef95ccb4903dea85bfd1447425cec1c0e5c935f"} Oct 01 14:15:02 crc kubenswrapper[4913]: I1001 14:15:02.881100 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:02 crc kubenswrapper[4913]: I1001 14:15:02.993936 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-config-volume\") pod \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " Oct 01 14:15:02 crc kubenswrapper[4913]: I1001 14:15:02.994115 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-secret-volume\") pod \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " Oct 01 14:15:02 crc kubenswrapper[4913]: I1001 14:15:02.994288 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75v6x\" (UniqueName: \"kubernetes.io/projected/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-kube-api-access-75v6x\") pod \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\" (UID: \"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a\") " Oct 01 14:15:02 crc kubenswrapper[4913]: I1001 14:15:02.994791 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-config-volume" (OuterVolumeSpecName: "config-volume") pod "4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" (UID: "4c454f89-716c-42c7-ade0-8b8bcdf1ff6a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:02 crc kubenswrapper[4913]: I1001 14:15:02.995339 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.000398 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" (UID: "4c454f89-716c-42c7-ade0-8b8bcdf1ff6a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.000472 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-kube-api-access-75v6x" (OuterVolumeSpecName: "kube-api-access-75v6x") pod "4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" (UID: "4c454f89-716c-42c7-ade0-8b8bcdf1ff6a"). InnerVolumeSpecName "kube-api-access-75v6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.097403 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75v6x\" (UniqueName: \"kubernetes.io/projected/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-kube-api-access-75v6x\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.097442 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c454f89-716c-42c7-ade0-8b8bcdf1ff6a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.536335 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" event={"ID":"4c454f89-716c-42c7-ade0-8b8bcdf1ff6a","Type":"ContainerDied","Data":"a462c160a44ea2c949af082e1ef95ccb4903dea85bfd1447425cec1c0e5c935f"} Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.536773 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a462c160a44ea2c949af082e1ef95ccb4903dea85bfd1447425cec1c0e5c935f" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.536385 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-gtkdb" Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.988179 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb"] Oct 01 14:15:03 crc kubenswrapper[4913]: I1001 14:15:03.995425 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-r7skb"] Oct 01 14:15:04 crc kubenswrapper[4913]: I1001 14:15:04.816945 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9469dd7-5611-48f4-868d-1a36066f43d0" path="/var/lib/kubelet/pods/c9469dd7-5611-48f4-868d-1a36066f43d0/volumes" Oct 01 14:15:09 crc kubenswrapper[4913]: I1001 14:15:09.016504 4913 scope.go:117] "RemoveContainer" containerID="06be0009edb44b7f12cdcf42d446a82ebf846b5aebe201febcee683d247033e5" Oct 01 14:15:09 crc kubenswrapper[4913]: I1001 14:15:09.807386 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:15:09 crc kubenswrapper[4913]: E1001 14:15:09.807884 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:15:22 crc kubenswrapper[4913]: I1001 14:15:22.807662 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:15:22 crc kubenswrapper[4913]: E1001 14:15:22.808525 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:15:31 crc kubenswrapper[4913]: I1001 14:15:31.775809 4913 generic.go:334] "Generic (PLEG): container finished" podID="567cfe78-d6e0-4a12-99d7-a280aeb55e68" containerID="865ad118e7696cfc40b30a09f2e3801c094660948457631ae61b1c5f2b694130" exitCode=0 Oct 01 14:15:31 crc kubenswrapper[4913]: I1001 14:15:31.775990 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"567cfe78-d6e0-4a12-99d7-a280aeb55e68","Type":"ContainerDied","Data":"865ad118e7696cfc40b30a09f2e3801c094660948457631ae61b1c5f2b694130"} Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.163296 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.248610 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Oct 01 14:15:33 crc kubenswrapper[4913]: E1001 14:15:33.249023 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567cfe78-d6e0-4a12-99d7-a280aeb55e68" containerName="tobiko-tests-tobiko" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.249042 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="567cfe78-d6e0-4a12-99d7-a280aeb55e68" containerName="tobiko-tests-tobiko" Oct 01 14:15:33 crc kubenswrapper[4913]: E1001 14:15:33.249053 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" containerName="collect-profiles" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.249060 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" containerName="collect-profiles" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.249279 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="567cfe78-d6e0-4a12-99d7-a280aeb55e68" containerName="tobiko-tests-tobiko" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.249318 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c454f89-716c-42c7-ade0-8b8bcdf1ff6a" containerName="collect-profiles" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.250137 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.289422 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-clouds-config\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.289487 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ceph\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.289509 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-workdir\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290542 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-temporary\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290588 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ca-certs\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290641 4913 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-private-key\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290676 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-openstack-config-secret\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290710 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fz2x\" (UniqueName: \"kubernetes.io/projected/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kube-api-access-7fz2x\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290732 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kubeconfig\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290765 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-public-key\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290803 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-config\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.290836 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\" (UID: \"567cfe78-d6e0-4a12-99d7-a280aeb55e68\") " Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291173 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291218 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291303 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfz2\" (UniqueName: \"kubernetes.io/projected/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kube-api-access-bzfz2\") pod 
\"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291347 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291388 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291418 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291447 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291468 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291493 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291537 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.291582 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 
14:15:33.293379 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.305081 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.305170 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kube-api-access-7fz2x" (OuterVolumeSpecName: "kube-api-access-7fz2x") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "kube-api-access-7fz2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.306753 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.316424 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ceph" (OuterVolumeSpecName: "ceph") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.327297 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.328018 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.336148 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "tobiko-private-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.339538 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.345312 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.360157 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.362595 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.393221 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.393606 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.393742 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfz2\" (UniqueName: \"kubernetes.io/projected/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kube-api-access-bzfz2\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394204 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394343 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394455 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394253 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394632 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394720 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-temporary\") pod 
\"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.394825 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.395895 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.396046 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.396349 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.395917 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.396444 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.395787 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397099 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.396577 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397356 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397481 4913 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397711 4913 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397812 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397909 4913 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kubeconfig\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.398040 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fz2x\" (UniqueName: \"kubernetes.io/projected/567cfe78-d6e0-4a12-99d7-a280aeb55e68-kube-api-access-7fz2x\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.398142 4913 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.398245 4913 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-tobiko-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.398362 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.397580 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.398091 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.398603 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.400745 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.402470 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.409147 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfz2\" (UniqueName: \"kubernetes.io/projected/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kube-api-access-bzfz2\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.423492 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.585185 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.794935 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"567cfe78-d6e0-4a12-99d7-a280aeb55e68","Type":"ContainerDied","Data":"dd5e0779a649969f08d197bf11a51b8fa2e5222b0b705d07d732576da3d0785d"} Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.795052 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5e0779a649969f08d197bf11a51b8fa2e5222b0b705d07d732576da3d0785d" Oct 01 14:15:33 crc kubenswrapper[4913]: I1001 14:15:33.795008 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Oct 01 14:15:34 crc kubenswrapper[4913]: I1001 14:15:34.137473 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Oct 01 14:15:34 crc kubenswrapper[4913]: I1001 14:15:34.864590 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d","Type":"ContainerStarted","Data":"6deb53370cc0478b36ca7770315906d9bc6e3b2c7b0657ee641718f04e3996c2"} Oct 01 14:15:34 crc kubenswrapper[4913]: I1001 14:15:34.962950 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "567cfe78-d6e0-4a12-99d7-a280aeb55e68" (UID: "567cfe78-d6e0-4a12-99d7-a280aeb55e68"). 
InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:15:35 crc kubenswrapper[4913]: I1001 14:15:35.051635 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/567cfe78-d6e0-4a12-99d7-a280aeb55e68-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:35 crc kubenswrapper[4913]: I1001 14:15:35.838738 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d","Type":"ContainerStarted","Data":"93b2ae5e908c63e26b3251a149a0cce630b7493601cffd5bb30eada0a5efb09a"} Oct 01 14:15:35 crc kubenswrapper[4913]: I1001 14:15:35.889525 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s01-sanity" podStartSLOduration=2.889500049 podStartE2EDuration="2.889500049s" podCreationTimestamp="2025-10-01 14:15:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:15:35.871748724 +0000 UTC m=+5867.775224312" watchObservedRunningTime="2025-10-01 14:15:35.889500049 +0000 UTC m=+5867.792975627" Oct 01 14:15:36 crc kubenswrapper[4913]: I1001 14:15:36.808440 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:15:36 crc kubenswrapper[4913]: E1001 14:15:36.809080 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:15:47 crc kubenswrapper[4913]: I1001 14:15:47.806778 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:15:47 crc kubenswrapper[4913]: E1001 14:15:47.808015 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.192436 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qld67"] Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.195040 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.216718 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qld67"] Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.251561 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-utilities\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.251781 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-catalog-content\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.251887 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bvz\" (UniqueName: \"kubernetes.io/projected/c87bd280-9ffb-4f32-b64a-ce359e6965a5-kube-api-access-w5bvz\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.353335 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bvz\" (UniqueName: \"kubernetes.io/projected/c87bd280-9ffb-4f32-b64a-ce359e6965a5-kube-api-access-w5bvz\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.353400 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-utilities\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.353507 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-catalog-content\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.353939 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-catalog-content\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.354446 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-utilities\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.384707 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w5bvz\" (UniqueName: \"kubernetes.io/projected/c87bd280-9ffb-4f32-b64a-ce359e6965a5-kube-api-access-w5bvz\") pod \"community-operators-qld67\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.515853 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.841600 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qld67"] Oct 01 14:15:50 crc kubenswrapper[4913]: I1001 14:15:50.979034 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerStarted","Data":"c987f19a3d6484e1dca35a1c7ed0a9197513bfa4e9c88774cfde78108bd6eb73"} Oct 01 14:15:51 crc kubenswrapper[4913]: I1001 14:15:51.992199 4913 generic.go:334] "Generic (PLEG): container finished" podID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerID="e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3" exitCode=0 Oct 01 14:15:51 crc kubenswrapper[4913]: I1001 14:15:51.992503 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerDied","Data":"e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3"} Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.005126 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerStarted","Data":"2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e"} Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.375305 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bjt48"] Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.377972 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.395527 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjt48"] Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.422600 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-catalog-content\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.422678 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djcr8\" (UniqueName: \"kubernetes.io/projected/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-kube-api-access-djcr8\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.422824 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-utilities\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.523535 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djcr8\" (UniqueName: \"kubernetes.io/projected/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-kube-api-access-djcr8\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.523682 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-utilities\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.523730 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-catalog-content\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.524314 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-catalog-content\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.524334 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-utilities\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.545077 4913 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-djcr8\" (UniqueName: \"kubernetes.io/projected/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-kube-api-access-djcr8\") pod \"redhat-marketplace-bjt48\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:53 crc kubenswrapper[4913]: I1001 14:15:53.698655 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:15:54 crc kubenswrapper[4913]: I1001 14:15:54.015363 4913 generic.go:334] "Generic (PLEG): container finished" podID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerID="2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e" exitCode=0 Oct 01 14:15:54 crc kubenswrapper[4913]: I1001 14:15:54.015476 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerDied","Data":"2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e"} Oct 01 14:15:54 crc kubenswrapper[4913]: I1001 14:15:54.216018 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjt48"] Oct 01 14:15:55 crc kubenswrapper[4913]: I1001 14:15:55.024581 4913 generic.go:334] "Generic (PLEG): container finished" podID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerID="09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624" exitCode=0 Oct 01 14:15:55 crc kubenswrapper[4913]: I1001 14:15:55.024681 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjt48" event={"ID":"4a95f32e-2336-4f26-8e9c-c7ac2fb56223","Type":"ContainerDied","Data":"09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624"} Oct 01 14:15:55 crc kubenswrapper[4913]: I1001 14:15:55.024983 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjt48" event={"ID":"4a95f32e-2336-4f26-8e9c-c7ac2fb56223","Type":"ContainerStarted","Data":"762a661e9586c28fefb346893a2e326c419506aa8ab27b02b5113304ab26c6a8"} Oct 01 14:15:55 crc kubenswrapper[4913]: I1001 14:15:55.027598 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerStarted","Data":"88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c"} Oct 01 14:15:55 crc kubenswrapper[4913]: I1001 14:15:55.069191 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qld67" podStartSLOduration=2.644515509 podStartE2EDuration="5.069175882s" podCreationTimestamp="2025-10-01 14:15:50 +0000 UTC" firstStartedPulling="2025-10-01 14:15:51.994745869 +0000 UTC m=+5883.898221447" lastFinishedPulling="2025-10-01 14:15:54.419406242 +0000 UTC m=+5886.322881820" observedRunningTime="2025-10-01 14:15:55.065788413 +0000 UTC m=+5886.969264011" watchObservedRunningTime="2025-10-01 14:15:55.069175882 +0000 UTC m=+5886.972651460" Oct 01 14:15:57 crc kubenswrapper[4913]: I1001 14:15:57.050983 4913 generic.go:334] "Generic (PLEG): container finished" podID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerID="1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a" exitCode=0 Oct 01 14:15:57 crc kubenswrapper[4913]: I1001 14:15:57.051060 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjt48" 
event={"ID":"4a95f32e-2336-4f26-8e9c-c7ac2fb56223","Type":"ContainerDied","Data":"1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a"} Oct 01 14:15:58 crc kubenswrapper[4913]: I1001 14:15:58.067416 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjt48" event={"ID":"4a95f32e-2336-4f26-8e9c-c7ac2fb56223","Type":"ContainerStarted","Data":"1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc"} Oct 01 14:15:58 crc kubenswrapper[4913]: I1001 14:15:58.106517 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bjt48" podStartSLOduration=2.520421612 podStartE2EDuration="5.106494823s" podCreationTimestamp="2025-10-01 14:15:53 +0000 UTC" firstStartedPulling="2025-10-01 14:15:55.026391729 +0000 UTC m=+5886.929867307" lastFinishedPulling="2025-10-01 14:15:57.61246494 +0000 UTC m=+5889.515940518" observedRunningTime="2025-10-01 14:15:58.096915591 +0000 UTC m=+5890.000391189" watchObservedRunningTime="2025-10-01 14:15:58.106494823 +0000 UTC m=+5890.009970411" Oct 01 14:15:58 crc kubenswrapper[4913]: I1001 14:15:58.813418 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:15:58 crc kubenswrapper[4913]: E1001 14:15:58.813781 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:16:00 crc kubenswrapper[4913]: I1001 14:16:00.516279 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:16:00 crc kubenswrapper[4913]: I1001 14:16:00.516660 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:16:00 crc kubenswrapper[4913]: I1001 14:16:00.569575 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:16:01 crc kubenswrapper[4913]: I1001 14:16:01.132882 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:16:01 crc kubenswrapper[4913]: I1001 14:16:01.770135 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qld67"] Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.104137 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qld67" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="registry-server" containerID="cri-o://88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c" gracePeriod=2 Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.562351 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.623948 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-utilities\") pod \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.624042 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-catalog-content\") pod \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.624081 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5bvz\" (UniqueName: \"kubernetes.io/projected/c87bd280-9ffb-4f32-b64a-ce359e6965a5-kube-api-access-w5bvz\") pod \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\" (UID: \"c87bd280-9ffb-4f32-b64a-ce359e6965a5\") " Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.624845 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-utilities" (OuterVolumeSpecName: "utilities") pod "c87bd280-9ffb-4f32-b64a-ce359e6965a5" (UID: "c87bd280-9ffb-4f32-b64a-ce359e6965a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.630332 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87bd280-9ffb-4f32-b64a-ce359e6965a5-kube-api-access-w5bvz" (OuterVolumeSpecName: "kube-api-access-w5bvz") pod "c87bd280-9ffb-4f32-b64a-ce359e6965a5" (UID: "c87bd280-9ffb-4f32-b64a-ce359e6965a5"). InnerVolumeSpecName "kube-api-access-w5bvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.699363 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.699645 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.728720 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5bvz\" (UniqueName: \"kubernetes.io/projected/c87bd280-9ffb-4f32-b64a-ce359e6965a5-kube-api-access-w5bvz\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.728754 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.747164 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c87bd280-9ffb-4f32-b64a-ce359e6965a5" (UID: "c87bd280-9ffb-4f32-b64a-ce359e6965a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.748618 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:16:03 crc kubenswrapper[4913]: I1001 14:16:03.830008 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c87bd280-9ffb-4f32-b64a-ce359e6965a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.113803 4913 generic.go:334] "Generic (PLEG): container finished" podID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerID="88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c" exitCode=0 Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.113873 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qld67" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.113884 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerDied","Data":"88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c"} Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.113931 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qld67" event={"ID":"c87bd280-9ffb-4f32-b64a-ce359e6965a5","Type":"ContainerDied","Data":"c987f19a3d6484e1dca35a1c7ed0a9197513bfa4e9c88774cfde78108bd6eb73"} Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.113957 4913 scope.go:117] "RemoveContainer" containerID="88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.151959 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qld67"] Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.152550 4913 scope.go:117] "RemoveContainer" containerID="2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.163132 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qld67"] Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.177649 4913 scope.go:117] "RemoveContainer" containerID="e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.177712 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.217968 4913 scope.go:117] "RemoveContainer" containerID="88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c" Oct 01 14:16:04 crc kubenswrapper[4913]: E1001 14:16:04.218376 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c\": container with ID starting with 88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c not found: ID does not exist" containerID="88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.218424 4913 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c"} err="failed to get container status \"88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c\": rpc error: code = NotFound desc = could not find container \"88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c\": container with ID starting with 88ed0c5d967ce530c04f163fde71c2216a5040332a89a3b1ff8003bf682e9a2c not found: ID does not exist" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.218451 4913 scope.go:117] "RemoveContainer" containerID="2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e" Oct 01 14:16:04 crc kubenswrapper[4913]: E1001 14:16:04.219069 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e\": container with ID starting with 2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e not found: ID does not exist" containerID="2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.219161 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e"} err="failed to get container status \"2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e\": rpc error: code = NotFound desc = could not find container \"2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e\": container with ID starting with 2d2b50d19774f9d464fd48d36db4991364a855928f29067a33615a894aa3472e not found: ID does not exist" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.219239 4913 scope.go:117] "RemoveContainer" containerID="e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3" Oct 01 14:16:04 crc kubenswrapper[4913]: E1001 14:16:04.219595 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3\": container with ID starting with e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3 not found: ID does not exist" containerID="e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.219625 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3"} err="failed to get container status \"e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3\": rpc error: code = NotFound desc = could not find container \"e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3\": container with ID starting with e6edf55744f78657c3736e0fba8362b24eef32b82cde6147079fa16dbb5c4cd3 not found: ID does not exist" Oct 01 14:16:04 crc kubenswrapper[4913]: I1001 14:16:04.830834 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" path="/var/lib/kubelet/pods/c87bd280-9ffb-4f32-b64a-ce359e6965a5/volumes" Oct 01 14:16:06 crc kubenswrapper[4913]: I1001 14:16:06.175571 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjt48"] Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.143653 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bjt48" 
podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="registry-server" containerID="cri-o://1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc" gracePeriod=2 Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.640030 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.802764 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-catalog-content\") pod \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.802957 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-utilities\") pod \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.803088 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djcr8\" (UniqueName: \"kubernetes.io/projected/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-kube-api-access-djcr8\") pod \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\" (UID: \"4a95f32e-2336-4f26-8e9c-c7ac2fb56223\") " Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.804506 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-utilities" (OuterVolumeSpecName: "utilities") pod "4a95f32e-2336-4f26-8e9c-c7ac2fb56223" (UID: "4a95f32e-2336-4f26-8e9c-c7ac2fb56223"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.813466 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-kube-api-access-djcr8" (OuterVolumeSpecName: "kube-api-access-djcr8") pod "4a95f32e-2336-4f26-8e9c-c7ac2fb56223" (UID: "4a95f32e-2336-4f26-8e9c-c7ac2fb56223"). InnerVolumeSpecName "kube-api-access-djcr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.821779 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a95f32e-2336-4f26-8e9c-c7ac2fb56223" (UID: "4a95f32e-2336-4f26-8e9c-c7ac2fb56223"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.905497 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djcr8\" (UniqueName: \"kubernetes.io/projected/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-kube-api-access-djcr8\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.905526 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:07 crc kubenswrapper[4913]: I1001 14:16:07.905535 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a95f32e-2336-4f26-8e9c-c7ac2fb56223-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.153206 4913 generic.go:334] "Generic (PLEG): container finished" podID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerID="1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc" exitCode=0 Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.153251 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjt48" event={"ID":"4a95f32e-2336-4f26-8e9c-c7ac2fb56223","Type":"ContainerDied","Data":"1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc"} Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.153304 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjt48" event={"ID":"4a95f32e-2336-4f26-8e9c-c7ac2fb56223","Type":"ContainerDied","Data":"762a661e9586c28fefb346893a2e326c419506aa8ab27b02b5113304ab26c6a8"} Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.153321 4913 scope.go:117] "RemoveContainer" containerID="1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.153326 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjt48" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.172179 4913 scope.go:117] "RemoveContainer" containerID="1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.190220 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjt48"] Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.200060 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjt48"] Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.213707 4913 scope.go:117] "RemoveContainer" containerID="09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.253916 4913 scope.go:117] "RemoveContainer" containerID="1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc" Oct 01 14:16:08 crc kubenswrapper[4913]: E1001 14:16:08.254303 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc\": container with ID starting with 1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc not found: ID does not exist" containerID="1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.254336 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc"} err="failed to get container status \"1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc\": rpc error: code = NotFound desc = could not find container \"1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc\": container with ID starting with 1f93b7fe556cc075a3fa87816f44faf06009bb8a4c5164f85f7cd2c3028047fc not found: ID does not exist" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.254360 4913 scope.go:117] "RemoveContainer" containerID="1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a" Oct 01 14:16:08 crc kubenswrapper[4913]: E1001 14:16:08.254634 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a\": container with ID starting with 1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a not found: ID does not exist" containerID="1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.254658 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a"} err="failed to get container status \"1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a\": rpc error: code = NotFound desc = could not find container \"1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a\": container with ID starting with 1f90d9e22b6224bf14bdb3ddeb6b8ca49f2a21471f2679d33362b5610b94288a not found: ID does not exist" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.254676 4913 scope.go:117] "RemoveContainer" containerID="09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624" Oct 01 14:16:08 crc kubenswrapper[4913]: E1001 14:16:08.254990 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624\": container with ID starting with 09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624 not found: ID does not exist" containerID="09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.255026 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624"} err="failed to get container status \"09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624\": rpc error: code = NotFound desc = could not find container \"09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624\": container with ID starting with 09051fca563c5eb30be194bdf092586f7ad3ca08cf68ee298c0750b560cce624 not found: ID does not exist" Oct 01 14:16:08 crc kubenswrapper[4913]: I1001 14:16:08.817718 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" path="/var/lib/kubelet/pods/4a95f32e-2336-4f26-8e9c-c7ac2fb56223/volumes" Oct 01 14:16:09 crc kubenswrapper[4913]: I1001 14:16:09.807016 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:16:09 crc kubenswrapper[4913]: E1001 14:16:09.807592 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:16:24 crc kubenswrapper[4913]: I1001 14:16:24.807020 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:16:24 crc kubenswrapper[4913]: E1001 14:16:24.807981 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:16:35 crc kubenswrapper[4913]: I1001 14:16:35.807671 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:16:35 crc kubenswrapper[4913]: E1001 14:16:35.810777 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:16:46 crc kubenswrapper[4913]: I1001 14:16:46.806936 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb" Oct 01 14:16:47 crc kubenswrapper[4913]: I1001 14:16:47.487354 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"504a2ad5746fe3c913dede622c0d007e14a82d5305df77d85d6a4fe2920436de"} Oct 01 14:17:10 crc kubenswrapper[4913]: I1001 14:17:10.705688 4913 generic.go:334] "Generic (PLEG): container finished" podID="d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" containerID="93b2ae5e908c63e26b3251a149a0cce630b7493601cffd5bb30eada0a5efb09a" exitCode=0 Oct 01 14:17:10 crc kubenswrapper[4913]: I1001 14:17:10.707161 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d","Type":"ContainerDied","Data":"93b2ae5e908c63e26b3251a149a0cce630b7493601cffd5bb30eada0a5efb09a"} Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.179666 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373471 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfz2\" (UniqueName: \"kubernetes.io/projected/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kube-api-access-bzfz2\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373505 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ca-certs\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373529 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-config\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373551 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-openstack-config-secret\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373599 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ceph\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373707 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-workdir\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373749 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-clouds-config\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: 
I1001 14:17:12.373805 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-temporary\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373832 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373875 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-public-key\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373911 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-private-key\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.373945 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kubeconfig\") pod \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\" (UID: \"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d\") " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.376426 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.388871 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.388889 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kube-api-access-bzfz2" (OuterVolumeSpecName: "kube-api-access-bzfz2") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "kube-api-access-bzfz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.388879 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ceph" (OuterVolumeSpecName: "ceph") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.404406 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.411708 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.421923 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "tobiko-private-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.424394 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.431872 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.435647 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.438807 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477594 4913 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kubeconfig\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477653 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfz2\" (UniqueName: \"kubernetes.io/projected/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-kube-api-access-bzfz2\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477667 4913 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477679 4913 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477694 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477706 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477731 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477753 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477838 4913 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477855 4913 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.477871 4913 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.503443 4913 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.582224 4913 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:12 crc 
kubenswrapper[4913]: I1001 14:17:12.725429 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"d2b76a33-156d-4aa9-9ee3-c17ac4bc753d","Type":"ContainerDied","Data":"6deb53370cc0478b36ca7770315906d9bc6e3b2c7b0657ee641718f04e3996c2"} Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.725467 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6deb53370cc0478b36ca7770315906d9bc6e3b2c7b0657ee641718f04e3996c2" Oct 01 14:17:12 crc kubenswrapper[4913]: I1001 14:17:12.725809 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Oct 01 14:17:13 crc kubenswrapper[4913]: I1001 14:17:13.914080 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" (UID: "d2b76a33-156d-4aa9-9ee3-c17ac4bc753d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:17:14 crc kubenswrapper[4913]: I1001 14:17:14.013477 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b76a33-156d-4aa9-9ee3-c17ac4bc753d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.443882 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445001 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="extract-content" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445018 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="extract-content" Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445041 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="registry-server" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445049 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="registry-server" Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445070 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="extract-utilities" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445079 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="extract-utilities" Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445100 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="extract-utilities" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445110 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="extract-utilities" Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445119 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="registry-server" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445128 4913 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="registry-server" Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445141 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="extract-content" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445148 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="extract-content" Oct 01 14:17:24 crc kubenswrapper[4913]: E1001 14:17:24.445163 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" containerName="tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445170 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" containerName="tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445428 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b76a33-156d-4aa9-9ee3-c17ac4bc753d" containerName="tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445448 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87bd280-9ffb-4f32-b64a-ce359e6965a5" containerName="registry-server" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.445472 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a95f32e-2336-4f26-8e9c-c7ac2fb56223" containerName="registry-server" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.446234 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.453051 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.599644 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.599701 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nwv8\" (UniqueName: \"kubernetes.io/projected/aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a-kube-api-access-9nwv8\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.701998 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.702112 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nwv8\" (UniqueName: \"kubernetes.io/projected/aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a-kube-api-access-9nwv8\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" 
(UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.702647 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.724811 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nwv8\" (UniqueName: \"kubernetes.io/projected/aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a-kube-api-access-9nwv8\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.734104 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:24 crc kubenswrapper[4913]: I1001 14:17:24.821719 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Oct 01 14:17:25 crc kubenswrapper[4913]: I1001 14:17:25.251660 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Oct 01 14:17:25 crc kubenswrapper[4913]: I1001 14:17:25.263211 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:17:25 crc kubenswrapper[4913]: I1001 14:17:25.885484 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a","Type":"ContainerStarted","Data":"41d06dda5887b052235bd238e32f8aca254724c55992e34205fcc1e48e2bdb8f"} Oct 01 14:17:25 crc kubenswrapper[4913]: I1001 14:17:25.885719 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a","Type":"ContainerStarted","Data":"e66ab598f407f37ef30970789a05413334e4ca13b2ad51ba82619e6c9d411f16"} Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.073188 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" podStartSLOduration=20.661236606 podStartE2EDuration="21.073162281s" podCreationTimestamp="2025-10-01 14:17:24 +0000 UTC" firstStartedPulling="2025-10-01 14:17:25.262995053 +0000 UTC m=+5977.166470631" lastFinishedPulling="2025-10-01 14:17:25.674920728 +0000 UTC m=+5977.578396306" observedRunningTime="2025-10-01 14:17:25.907171697 +0000 UTC m=+5977.810647275" watchObservedRunningTime="2025-10-01 14:17:45.073162281 +0000 UTC m=+5996.976637869" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.084003 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ansibletest-ansibletest"] Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.085603 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.088006 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.088084 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.096858 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.151551 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.151656 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.151720 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.151768 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.151856 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.152011 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.152107 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ceph\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.152166 4913 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.152366 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zwd\" (UniqueName: \"kubernetes.io/projected/5dc4653a-10fc-461a-a058-44cb58eb7847-kube-api-access-f8zwd\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.152435 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254311 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254421 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zwd\" (UniqueName: \"kubernetes.io/projected/5dc4653a-10fc-461a-a058-44cb58eb7847-kube-api-access-f8zwd\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254458 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254528 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254559 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254588 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254622 4913 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254673 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254726 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254762 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ceph\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.254931 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.260236 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.260869 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.260944 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.262612 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.263070 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"compute-ssh-secret\" (UniqueName: 
\"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.264101 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ceph\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.266307 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.273398 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.275397 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zwd\" (UniqueName: \"kubernetes.io/projected/5dc4653a-10fc-461a-a058-44cb58eb7847-kube-api-access-f8zwd\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.291886 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ansibletest-ansibletest\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.411281 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 01 14:17:45 crc kubenswrapper[4913]: I1001 14:17:45.865315 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Oct 01 14:17:46 crc kubenswrapper[4913]: I1001 14:17:46.058864 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"5dc4653a-10fc-461a-a058-44cb58eb7847","Type":"ContainerStarted","Data":"40499e5129ec53af20a425dc0fef1730839956f64aa60cc0eb3d2fedfdb6d1af"} Oct 01 14:18:13 crc kubenswrapper[4913]: E1001 14:18:13.331868 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified" Oct 01 14:18:13 crc kubenswrapper[4913]: E1001 14:18:13.332536 4913 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 14:18:13 crc kubenswrapper[4913]: container &Container{Name:ansibletest-ansibletest,Image:quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_ANSIBLE_EXTRA_VARS,Value:-e manual_run=false,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_FILE_EXTRA_VARS,Value:--- Oct 01 14:18:13 crc kubenswrapper[4913]: foo: bar Oct 01 14:18:13 crc kubenswrapper[4913]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_BRANCH,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_REPO,Value:https://github.com/ansible/test-playbooks,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_INVENTORY,Value:localhost ansible_connection=local ansible_python_interpreter=python3 Oct 01 14:18:13 crc kubenswrapper[4913]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_PLAYBOOK,Value:./debug.yml,ValueFrom:nil,},EnvVar{Name:POD_DEBUG,Value:false,ValueFrom:nil,},EnvVar{Name:POD_INSTALL_COLLECTIONS,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/ansible,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/AnsibleTests/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/ansible/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/var/lib/ansible/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:workload-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/test_keypair.key,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:compute-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/.ssh/compute_id,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8zwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*227,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*227,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ansibletest-ansibletest_openstack(5dc4653a-10fc-461a-a058-44cb58eb7847): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Oct 01 14:18:13 crc kubenswrapper[4913]: > logger="UnhandledError" Oct 01 14:18:13 crc kubenswrapper[4913]: E1001 14:18:13.333760 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ansibletest-ansibletest" podUID="5dc4653a-10fc-461a-a058-44cb58eb7847" Oct 01 14:18:14 crc kubenswrapper[4913]: E1001 14:18:14.305677 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ansibletest-ansibletest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified\\\"\"" pod="openstack/ansibletest-ansibletest" podUID="5dc4653a-10fc-461a-a058-44cb58eb7847" Oct 01 14:18:26 crc kubenswrapper[4913]: I1001 14:18:26.398569 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"5dc4653a-10fc-461a-a058-44cb58eb7847","Type":"ContainerStarted","Data":"8beede9a3096e65b18424549d558071fdddd59107479892dee0dc056951a1613"} Oct 01 14:18:26 crc kubenswrapper[4913]: I1001 14:18:26.420430 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ansibletest-ansibletest" podStartSLOduration=2.915730292 podStartE2EDuration="42.420410383s" podCreationTimestamp="2025-10-01 14:17:44 +0000 UTC" firstStartedPulling="2025-10-01 14:17:45.878955939 +0000 UTC m=+5997.782431517" lastFinishedPulling="2025-10-01 14:18:25.38363603 +0000 UTC m=+6037.287111608" observedRunningTime="2025-10-01 14:18:26.415069322 +0000 UTC m=+6038.318544920" watchObservedRunningTime="2025-10-01 14:18:26.420410383 +0000 UTC m=+6038.323885961" Oct 01 14:18:28 crc kubenswrapper[4913]: I1001 14:18:28.418026 4913 generic.go:334] "Generic (PLEG): container finished" podID="5dc4653a-10fc-461a-a058-44cb58eb7847" containerID="8beede9a3096e65b18424549d558071fdddd59107479892dee0dc056951a1613" exitCode=0 Oct 01 14:18:28 crc kubenswrapper[4913]: I1001 14:18:28.418111 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"5dc4653a-10fc-461a-a058-44cb58eb7847","Type":"ContainerDied","Data":"8beede9a3096e65b18424549d558071fdddd59107479892dee0dc056951a1613"} Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.769966 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875145 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875247 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-workdir\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875338 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zwd\" (UniqueName: \"kubernetes.io/projected/5dc4653a-10fc-461a-a058-44cb58eb7847-kube-api-access-f8zwd\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875388 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ca-certs\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875444 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-workload-ssh-secret\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875531 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config-secret\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875551 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875587 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-temporary\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875649 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-compute-ssh-secret\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.875692 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ceph\") pod \"5dc4653a-10fc-461a-a058-44cb58eb7847\" (UID: \"5dc4653a-10fc-461a-a058-44cb58eb7847\") " Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.876118 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.876590 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.884058 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc4653a-10fc-461a-a058-44cb58eb7847-kube-api-access-f8zwd" (OuterVolumeSpecName: "kube-api-access-f8zwd") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "kube-api-access-f8zwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.884705 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ceph" (OuterVolumeSpecName: "ceph") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.884861 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.896157 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.908786 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-workload-ssh-secret" (OuterVolumeSpecName: "workload-ssh-secret") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "workload-ssh-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.909340 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.919703 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-compute-ssh-secret" (OuterVolumeSpecName: "compute-ssh-secret") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "compute-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.945554 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.956379 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5dc4653a-10fc-461a-a058-44cb58eb7847" (UID: "5dc4653a-10fc-461a-a058-44cb58eb7847"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.979917 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.979959 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5dc4653a-10fc-461a-a058-44cb58eb7847-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.979974 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8zwd\" (UniqueName: \"kubernetes.io/projected/5dc4653a-10fc-461a-a058-44cb58eb7847-kube-api-access-f8zwd\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.979986 4913 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.980000 4913 reconciler_common.go:293] "Volume detached for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-workload-ssh-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.980033 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 
Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.980078 4913 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.980116 4913 reconciler_common.go:293] "Volume detached for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-compute-ssh-secret\") on node \"crc\" DevicePath \"\""
Oct 01 14:18:29 crc kubenswrapper[4913]: I1001 14:18:29.980152 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5dc4653a-10fc-461a-a058-44cb58eb7847-ceph\") on node \"crc\" DevicePath \"\""
Oct 01 14:18:30 crc kubenswrapper[4913]: I1001 14:18:30.005723 4913 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Oct 01 14:18:30 crc kubenswrapper[4913]: I1001 14:18:30.082960 4913 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Oct 01 14:18:30 crc kubenswrapper[4913]: I1001 14:18:30.435770 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"5dc4653a-10fc-461a-a058-44cb58eb7847","Type":"ContainerDied","Data":"40499e5129ec53af20a425dc0fef1730839956f64aa60cc0eb3d2fedfdb6d1af"}
Oct 01 14:18:30 crc kubenswrapper[4913]: I1001 14:18:30.436383 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40499e5129ec53af20a425dc0fef1730839956f64aa60cc0eb3d2fedfdb6d1af"
Oct 01 14:18:30 crc kubenswrapper[4913]: I1001 14:18:30.436517 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest"
Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.181366 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"]
Oct 01 14:18:41 crc kubenswrapper[4913]: E1001 14:18:41.182414 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc4653a-10fc-461a-a058-44cb58eb7847" containerName="ansibletest-ansibletest"
Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.182439 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc4653a-10fc-461a-a058-44cb58eb7847" containerName="ansibletest-ansibletest"
Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.182672 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc4653a-10fc-461a-a058-44cb58eb7847" containerName="ansibletest-ansibletest"
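The RemoveStaleState entries above show the CPU and memory managers pruning per-container assignments that still reference the deleted ansibletest pod. A simplified model of that cleanup (field and function names are invented for illustration): assignments are keyed by pod UID plus container name, and any key whose pod is no longer active is dropped.

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments for pods the API no longer knows about.
func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", k.podUID, k.container)
			delete(assignments, k) // corresponds to "Deleted CPUSet assignment"
		}
	}
}

func main() {
	state := map[key]string{
		{"5dc4653a-10fc-461a-a058-44cb58eb7847", "ansibletest-ansibletest"}: "cpus 0-1",
	}
	removeStaleState(state, map[string]bool{}) // the pod has already been deleted
}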
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.199631 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.308370 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2sb\" (UniqueName: \"kubernetes.io/projected/96a792f4-42c4-4e02-8ac3-e49612ad6a30-kube-api-access-pz2sb\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.308530 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.410020 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.410114 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2sb\" (UniqueName: \"kubernetes.io/projected/96a792f4-42c4-4e02-8ac3-e49612ad6a30-kube-api-access-pz2sb\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.410588 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.433924 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2sb\" (UniqueName: \"kubernetes.io/projected/96a792f4-42c4-4e02-8ac3-e49612ad6a30-kube-api-access-pz2sb\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.438200 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"96a792f4-42c4-4e02-8ac3-e49612ad6a30\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.508408 4913 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Oct 01 14:18:41 crc kubenswrapper[4913]: I1001 14:18:41.940765 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Oct 01 14:18:42 crc kubenswrapper[4913]: I1001 14:18:42.545754 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"96a792f4-42c4-4e02-8ac3-e49612ad6a30","Type":"ContainerStarted","Data":"d95771e037842a795bd7afab277a76ebdee80f24ff2ec49eb3e70181f18a3a8e"} Oct 01 14:18:43 crc kubenswrapper[4913]: I1001 14:18:43.555986 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"96a792f4-42c4-4e02-8ac3-e49612ad6a30","Type":"ContainerStarted","Data":"07255740bcb8f38406cebcbe0de67ca4186773a38f083fc4719bf5a0deb0e9d3"} Oct 01 14:18:43 crc kubenswrapper[4913]: I1001 14:18:43.576025 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" podStartSLOduration=1.930584036 podStartE2EDuration="2.576003484s" podCreationTimestamp="2025-10-01 14:18:41 +0000 UTC" firstStartedPulling="2025-10-01 14:18:41.943992022 +0000 UTC m=+6053.847467600" lastFinishedPulling="2025-10-01 14:18:42.58941147 +0000 UTC m=+6054.492887048" observedRunningTime="2025-10-01 14:18:43.57089496 +0000 UTC m=+6055.474370648" watchObservedRunningTime="2025-10-01 14:18:43.576003484 +0000 UTC m=+6055.479479072" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.301252 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizontest-tests-horizontest"] Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.303169 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.305015 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.305237 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizontest-tests-horizontesthorizontest-config" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.324470 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.414766 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62tj\" (UniqueName: \"kubernetes.io/projected/3d11e9df-3564-44cc-a29f-9d6b3f043853-kube-api-access-g62tj\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.414841 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.414936 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.414960 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.415000 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.415032 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.415054 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " 
pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.415202 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.516976 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517063 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62tj\" (UniqueName: \"kubernetes.io/projected/3d11e9df-3564-44cc-a29f-9d6b3f043853-kube-api-access-g62tj\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517114 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517178 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517206 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517247 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517295 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517319 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: 
\"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.517622 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.518117 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.518448 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.520097 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.524560 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.524585 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.526111 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.536985 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62tj\" (UniqueName: \"kubernetes.io/projected/3d11e9df-3564-44cc-a29f-9d6b3f043853-kube-api-access-g62tj\") pod \"horizontest-tests-horizontest\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.554382 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"horizontest-tests-horizontest\" (UID: 
\"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:01 crc kubenswrapper[4913]: I1001 14:19:01.633977 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest" Oct 01 14:19:02 crc kubenswrapper[4913]: I1001 14:19:02.064834 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Oct 01 14:19:02 crc kubenswrapper[4913]: W1001 14:19:02.069706 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d11e9df_3564_44cc_a29f_9d6b3f043853.slice/crio-44790b558158efe1aa11daa1edb8af87b34c0bf594d8159feb9c1e49d7800f42 WatchSource:0}: Error finding container 44790b558158efe1aa11daa1edb8af87b34c0bf594d8159feb9c1e49d7800f42: Status 404 returned error can't find the container with id 44790b558158efe1aa11daa1edb8af87b34c0bf594d8159feb9c1e49d7800f42 Oct 01 14:19:02 crc kubenswrapper[4913]: I1001 14:19:02.727921 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"3d11e9df-3564-44cc-a29f-9d6b3f043853","Type":"ContainerStarted","Data":"44790b558158efe1aa11daa1edb8af87b34c0bf594d8159feb9c1e49d7800f42"} Oct 01 14:19:10 crc kubenswrapper[4913]: I1001 14:19:10.083639 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:19:10 crc kubenswrapper[4913]: I1001 14:19:10.084254 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:19:37 crc kubenswrapper[4913]: E1001 14:19:37.952810 4913 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizontest:current-podified" Oct 01 14:19:37 crc kubenswrapper[4913]: E1001 14:19:37.953557 4913 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizontest-tests-horizontest,Image:quay.io/podified-antelope-centos9/openstack-horizontest:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADMIN_PASSWORD,Value:12345678,ValueFrom:nil,},EnvVar{Name:ADMIN_USERNAME,Value:admin,ValueFrom:nil,},EnvVar{Name:AUTH_URL,Value:https://keystone-public-openstack.apps-crc.testing,ValueFrom:nil,},EnvVar{Name:DASHBOARD_URL,Value:https://horizon-openstack.apps-crc.testing/,ValueFrom:nil,},EnvVar{Name:EXTRA_FLAG,Value:not pagination and 
test_users.py,ValueFrom:nil,},EnvVar{Name:FLAVOR_NAME,Value:m1.tiny,ValueFrom:nil,},EnvVar{Name:HORIZONTEST_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:HORIZON_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:HORIZON_LOGS_DIR_NAME,Value:horizon,ValueFrom:nil,},EnvVar{Name:HORIZON_REPO_BRANCH,Value:master,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE,Value:/var/lib/horizontest/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE_NAME,Value:cirros-0.6.2-x86_64-disk,ValueFrom:nil,},EnvVar{Name:IMAGE_URL,Value:http://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:PASSWORD,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME_XPATH,Value://*[@class=\"context-project\"]//ancestor::ul,ValueFrom:nil,},EnvVar{Name:REPO_URL,Value:https://review.opendev.org/openstack/horizon,ValueFrom:nil,},EnvVar{Name:USER_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{1 0} {} 1 DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/horizontest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/horizontest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/horizontest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g62tj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN 
NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42455,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42455,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizontest-tests-horizontest_openstack(3d11e9df-3564-44cc-a29f-9d6b3f043853): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 14:19:37 crc kubenswrapper[4913]: E1001 14:19:37.954961 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizontest-tests-horizontest" podUID="3d11e9df-3564-44cc-a29f-9d6b3f043853" Oct 01 14:19:38 crc kubenswrapper[4913]: E1001 14:19:38.094409 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizontest:current-podified\\\"\"" pod="openstack/horizontest-tests-horizontest" podUID="3d11e9df-3564-44cc-a29f-9d6b3f043853" Oct 01 14:19:40 crc kubenswrapper[4913]: I1001 14:19:40.084094 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:19:40 crc kubenswrapper[4913]: I1001 14:19:40.084483 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:19:52 crc kubenswrapper[4913]: I1001 14:19:52.242302 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"3d11e9df-3564-44cc-a29f-9d6b3f043853","Type":"ContainerStarted","Data":"4b9d2cdb180ed08659ce4786556daa531b59c2ed32d36781d4b160f60a100187"} Oct 01 14:19:52 crc kubenswrapper[4913]: I1001 14:19:52.285837 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizontest-tests-horizontest" podStartSLOduration=3.832603336 podStartE2EDuration="52.285815643s" podCreationTimestamp="2025-10-01 14:19:00 +0000 UTC" firstStartedPulling="2025-10-01 14:19:02.072372909 +0000 UTC m=+6073.975848487" lastFinishedPulling="2025-10-01 14:19:50.525585216 +0000 UTC m=+6122.429060794" observedRunningTime="2025-10-01 14:19:52.274011045 +0000 UTC m=+6124.177486643" watchObservedRunningTime="2025-10-01 14:19:52.285815643 +0000 UTC m=+6124.189291221" Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.084078 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:20:10 
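The ErrImagePull followed by ImagePullBackOff above is kubelet's pull backoff: after a failed pull the pod is retried with an increasing delay until the pull eventually succeeds (here at 14:19:52; the startup-latency entry shows roughly 48s spent pulling). As a hedged illustration only, the commonly cited kubelet defaults are an initial 10s delay that doubles up to a 5m cap; those numbers are an assumption, nothing in this log confirms them:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute // assumed defaults
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d: back off %v before retrying the pull\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}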
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.084635 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.084679 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg"
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.085519 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"504a2ad5746fe3c913dede622c0d007e14a82d5305df77d85d6a4fe2920436de"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.085575 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://504a2ad5746fe3c913dede622c0d007e14a82d5305df77d85d6a4fe2920436de" gracePeriod=600
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.408531 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="504a2ad5746fe3c913dede622c0d007e14a82d5305df77d85d6a4fe2920436de" exitCode=0
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.408643 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"504a2ad5746fe3c913dede622c0d007e14a82d5305df77d85d6a4fe2920436de"}
Oct 01 14:20:10 crc kubenswrapper[4913]: I1001 14:20:10.408930 4913 scope.go:117] "RemoveContainer" containerID="a31296fefa6b66dd62df7d4aa553e56c9e40d778dc2672ea2d79d372b81ab5bb"
Oct 01 14:20:11 crc kubenswrapper[4913]: I1001 14:20:11.418130 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6"}
Oct 01 14:21:44 crc kubenswrapper[4913]: I1001 14:21:44.208050 4913 generic.go:334] "Generic (PLEG): container finished" podID="3d11e9df-3564-44cc-a29f-9d6b3f043853" containerID="4b9d2cdb180ed08659ce4786556daa531b59c2ed32d36781d4b160f60a100187" exitCode=0
Oct 01 14:21:44 crc kubenswrapper[4913]: I1001 14:21:44.208164 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"3d11e9df-3564-44cc-a29f-9d6b3f043853","Type":"ContainerDied","Data":"4b9d2cdb180ed08659ce4786556daa531b59c2ed32d36781d4b160f60a100187"}
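The machine-config-daemon restart above is the liveness-probe path end to end: the HTTP GET against http://127.0.0.1:8798/health is refused, and after repeated failures (14:19:10, 14:19:40, 14:20:10, spaced 30s apart) kubelet kills the container with the configured grace period and starts a replacement. The 30s spacing and a kill at the third failure are consistent with periodSeconds=30 and failureThreshold=3, but those spec values do not appear in the log and are an assumption. A minimal sketch of the probe loop under that assumption:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	const failureThreshold = 3 // assumed, see note above
	failures := 0
	for {
		resp, err := http.Get("http://127.0.0.1:8798/health")
		healthy := err == nil && resp.StatusCode < 400
		if err == nil {
			resp.Body.Close()
		}
		if healthy {
			failures = 0
		} else {
			failures++
			fmt.Println("Probe failed probeType=\"Liveness\":", err)
		}
		if failures >= failureThreshold {
			fmt.Println("Container failed liveness probe, will be restarted")
			return
		}
		time.Sleep(30 * time.Second) // assumed periodSeconds
	}
}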
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.673235 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ca-certs\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.673481 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-clouds-config\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.673788 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-workdir\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.673899 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-openstack-config-secret\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.673935 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62tj\" (UniqueName: \"kubernetes.io/projected/3d11e9df-3564-44cc-a29f-9d6b3f043853-kube-api-access-g62tj\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.673961 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ceph\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.674084 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-temporary\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.674201 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3d11e9df-3564-44cc-a29f-9d6b3f043853\" (UID: \"3d11e9df-3564-44cc-a29f-9d6b3f043853\") " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.674776 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.675297 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.679184 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ceph" (OuterVolumeSpecName: "ceph") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.679765 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.681045 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d11e9df-3564-44cc-a29f-9d6b3f043853-kube-api-access-g62tj" (OuterVolumeSpecName: "kube-api-access-g62tj") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "kube-api-access-g62tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.701720 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.722409 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.731767 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.777370 4913 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.777405 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.777419 4913 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.777431 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62tj\" (UniqueName: \"kubernetes.io/projected/3d11e9df-3564-44cc-a29f-9d6b3f043853-kube-api-access-g62tj\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.777443 4913 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3d11e9df-3564-44cc-a29f-9d6b3f043853-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.777475 4913 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.801240 4913 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.878857 4913 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.900366 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "3d11e9df-3564-44cc-a29f-9d6b3f043853" (UID: "3d11e9df-3564-44cc-a29f-9d6b3f043853"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:21:45 crc kubenswrapper[4913]: I1001 14:21:45.981070 4913 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3d11e9df-3564-44cc-a29f-9d6b3f043853-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:46 crc kubenswrapper[4913]: I1001 14:21:46.234789 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"3d11e9df-3564-44cc-a29f-9d6b3f043853","Type":"ContainerDied","Data":"44790b558158efe1aa11daa1edb8af87b34c0bf594d8159feb9c1e49d7800f42"} Oct 01 14:21:46 crc kubenswrapper[4913]: I1001 14:21:46.234827 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44790b558158efe1aa11daa1edb8af87b34c0bf594d8159feb9c1e49d7800f42" Oct 01 14:21:46 crc kubenswrapper[4913]: I1001 14:21:46.234836 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.625191 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Oct 01 14:21:55 crc kubenswrapper[4913]: E1001 14:21:55.626348 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d11e9df-3564-44cc-a29f-9d6b3f043853" containerName="horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.626361 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d11e9df-3564-44cc-a29f-9d6b3f043853" containerName="horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.626575 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d11e9df-3564-44cc-a29f-9d6b3f043853" containerName="horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.627334 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.635524 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.795151 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xkv\" (UniqueName: \"kubernetes.io/projected/d933ac8f-d113-4120-8848-65d6c2affdac-kube-api-access-x6xkv\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.795374 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.897056 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xkv\" (UniqueName: \"kubernetes.io/projected/d933ac8f-d113-4120-8848-65d6c2affdac-kube-api-access-x6xkv\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.897118 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.897661 4913 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.930536 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xkv\" (UniqueName: \"kubernetes.io/projected/d933ac8f-d113-4120-8848-65d6c2affdac-kube-api-access-x6xkv\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:55 crc kubenswrapper[4913]: I1001 14:21:55.954739 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"d933ac8f-d113-4120-8848-65d6c2affdac\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 
14:21:56 crc kubenswrapper[4913]: I1001 14:21:56.246401 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Oct 01 14:21:56 crc kubenswrapper[4913]: E1001 14:21:56.246758 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:21:56 crc kubenswrapper[4913]: I1001 14:21:56.667897 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Oct 01 14:21:56 crc kubenswrapper[4913]: E1001 14:21:56.675110 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:21:57 crc kubenswrapper[4913]: E1001 14:21:57.150306 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:21:57 crc kubenswrapper[4913]: I1001 14:21:57.353410 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"d933ac8f-d113-4120-8848-65d6c2affdac","Type":"ContainerStarted","Data":"07d380dc32661ae41b37672f28c2ff0a4026ea2d4e69d9841f6a00143e9f5592"} Oct 01 14:21:57 crc kubenswrapper[4913]: I1001 14:21:57.353785 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"d933ac8f-d113-4120-8848-65d6c2affdac","Type":"ContainerStarted","Data":"e8bea650c72a402448f26c11efc76d42977c25e2912dfefb027b42d156247266"} Oct 01 14:21:57 crc kubenswrapper[4913]: E1001 14:21:57.354540 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:21:58 crc kubenswrapper[4913]: E1001 14:21:58.361499 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:22:10 crc kubenswrapper[4913]: I1001 14:22:10.083946 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:22:10 crc kubenswrapper[4913]: I1001 14:22:10.084533 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.769829 4913 pod_startup_latency_tracker.go:104] "Observed pod startup 
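The kubelet_pods.go:538 entries above show the pod hostname being cut to the 63-character limit (the RFC 1123 label length): the 64-character pod name loses its trailing "t", which is exactly the truncatedHostname the log prints. A sketch of that truncation; note the real kubelet additionally trims any trailing dashes or dots left by the cut, which this toy version omits:

package main

import "fmt"

func truncateHostname(name string, maxLen int) string {
	if len(name) > maxLen {
		return name[:maxLen]
	}
	return name
}

func main() {
	pod := "test-operator-logs-pod-horizontest-horizontest-tests-horizontest" // 64 chars
	fmt.Println(truncateHostname(pod, 63)) // ...-tests-horizontes, as logged
}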
duration" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" podStartSLOduration=17.296329092 podStartE2EDuration="17.769813444s" podCreationTimestamp="2025-10-01 14:21:55 +0000 UTC" firstStartedPulling="2025-10-01 14:21:56.67665632 +0000 UTC m=+6248.580131898" lastFinishedPulling="2025-10-01 14:21:57.150140672 +0000 UTC m=+6249.053616250" observedRunningTime="2025-10-01 14:21:57.368649507 +0000 UTC m=+6249.272125105" watchObservedRunningTime="2025-10-01 14:22:12.769813444 +0000 UTC m=+6264.673289012" Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.778104 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvf2r"] Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.782033 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.792622 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvf2r"] Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.953452 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhshf\" (UniqueName: \"kubernetes.io/projected/0c370516-b043-4b97-a51e-56bd46f86846-kube-api-access-zhshf\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.953521 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-catalog-content\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:12 crc kubenswrapper[4913]: I1001 14:22:12.953542 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-utilities\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.054825 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhshf\" (UniqueName: \"kubernetes.io/projected/0c370516-b043-4b97-a51e-56bd46f86846-kube-api-access-zhshf\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.054882 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-catalog-content\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.054903 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-utilities\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.055546 4913 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-utilities\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.056088 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-catalog-content\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.074959 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhshf\" (UniqueName: \"kubernetes.io/projected/0c370516-b043-4b97-a51e-56bd46f86846-kube-api-access-zhshf\") pod \"redhat-operators-wvf2r\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.101465 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:13 crc kubenswrapper[4913]: I1001 14:22:13.567006 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvf2r"] Oct 01 14:22:13 crc kubenswrapper[4913]: W1001 14:22:13.578758 4913 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c370516_b043_4b97_a51e_56bd46f86846.slice/crio-460105255cb5e4e469a8f841854accfb34398dbfa8f389e5109fe72b5499a31b WatchSource:0}: Error finding container 460105255cb5e4e469a8f841854accfb34398dbfa8f389e5109fe72b5499a31b: Status 404 returned error can't find the container with id 460105255cb5e4e469a8f841854accfb34398dbfa8f389e5109fe72b5499a31b Oct 01 14:22:14 crc kubenswrapper[4913]: I1001 14:22:14.514021 4913 generic.go:334] "Generic (PLEG): container finished" podID="0c370516-b043-4b97-a51e-56bd46f86846" containerID="d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6" exitCode=0 Oct 01 14:22:14 crc kubenswrapper[4913]: I1001 14:22:14.514079 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerDied","Data":"d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6"} Oct 01 14:22:14 crc kubenswrapper[4913]: I1001 14:22:14.514330 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerStarted","Data":"460105255cb5e4e469a8f841854accfb34398dbfa8f389e5109fe72b5499a31b"} Oct 01 14:22:15 crc kubenswrapper[4913]: I1001 14:22:15.524948 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerStarted","Data":"8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75"} Oct 01 14:22:18 crc kubenswrapper[4913]: I1001 14:22:18.553061 4913 generic.go:334] "Generic (PLEG): container finished" podID="0c370516-b043-4b97-a51e-56bd46f86846" containerID="8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75" exitCode=0 Oct 01 14:22:18 crc kubenswrapper[4913]: I1001 14:22:18.553134 4913 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerDied","Data":"8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75"} Oct 01 14:22:19 crc kubenswrapper[4913]: I1001 14:22:19.565075 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerStarted","Data":"3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593"} Oct 01 14:22:19 crc kubenswrapper[4913]: I1001 14:22:19.587469 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvf2r" podStartSLOduration=3.116043258 podStartE2EDuration="7.587440506s" podCreationTimestamp="2025-10-01 14:22:12 +0000 UTC" firstStartedPulling="2025-10-01 14:22:14.516920998 +0000 UTC m=+6266.420396576" lastFinishedPulling="2025-10-01 14:22:18.988318246 +0000 UTC m=+6270.891793824" observedRunningTime="2025-10-01 14:22:19.580362699 +0000 UTC m=+6271.483838307" watchObservedRunningTime="2025-10-01 14:22:19.587440506 +0000 UTC m=+6271.490916084" Oct 01 14:22:23 crc kubenswrapper[4913]: I1001 14:22:23.101779 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:23 crc kubenswrapper[4913]: I1001 14:22:23.102351 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.152384 4913 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvf2r" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="registry-server" probeResult="failure" output=< Oct 01 14:22:24 crc kubenswrapper[4913]: timeout: failed to connect service ":50051" within 1s Oct 01 14:22:24 crc kubenswrapper[4913]: > Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.303984 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fzk86/must-gather-9ddn5"] Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.306152 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.308370 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fzk86"/"default-dockercfg-qd9vb" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.308891 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fzk86"/"kube-root-ca.crt" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.308970 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fzk86"/"openshift-service-ca.crt" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.315523 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fzk86/must-gather-9ddn5"] Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.474958 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/833aa627-cc45-4ac9-b59b-9b0116867977-must-gather-output\") pod \"must-gather-9ddn5\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.475086 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rck\" (UniqueName: \"kubernetes.io/projected/833aa627-cc45-4ac9-b59b-9b0116867977-kube-api-access-v4rck\") pod \"must-gather-9ddn5\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.576510 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/833aa627-cc45-4ac9-b59b-9b0116867977-must-gather-output\") pod \"must-gather-9ddn5\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.576669 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rck\" (UniqueName: \"kubernetes.io/projected/833aa627-cc45-4ac9-b59b-9b0116867977-kube-api-access-v4rck\") pod \"must-gather-9ddn5\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.577103 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/833aa627-cc45-4ac9-b59b-9b0116867977-must-gather-output\") pod \"must-gather-9ddn5\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.595105 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rck\" (UniqueName: \"kubernetes.io/projected/833aa627-cc45-4ac9-b59b-9b0116867977-kube-api-access-v4rck\") pod \"must-gather-9ddn5\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:24 crc kubenswrapper[4913]: I1001 14:22:24.627888 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:22:25 crc kubenswrapper[4913]: I1001 14:22:25.119602 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fzk86/must-gather-9ddn5"] Oct 01 14:22:25 crc kubenswrapper[4913]: I1001 14:22:25.619039 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/must-gather-9ddn5" event={"ID":"833aa627-cc45-4ac9-b59b-9b0116867977","Type":"ContainerStarted","Data":"b791867aa270a3fbcd952ea9dce7864dd22b5ad57bf49e96246b357df051b6b2"} Oct 01 14:22:30 crc kubenswrapper[4913]: I1001 14:22:30.671709 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/must-gather-9ddn5" event={"ID":"833aa627-cc45-4ac9-b59b-9b0116867977","Type":"ContainerStarted","Data":"f609ef08b1179a78e9facd28eab4cfd260c1cfaa292cd6dd984b5ecd9f8939dd"} Oct 01 14:22:30 crc kubenswrapper[4913]: I1001 14:22:30.672453 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/must-gather-9ddn5" event={"ID":"833aa627-cc45-4ac9-b59b-9b0116867977","Type":"ContainerStarted","Data":"76d46fec2319fad77761fe3136cbd7358fe4d94031ec542d7b9db1748e45062f"} Oct 01 14:22:30 crc kubenswrapper[4913]: I1001 14:22:30.696506 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fzk86/must-gather-9ddn5" podStartSLOduration=2.440213692 podStartE2EDuration="6.696477959s" podCreationTimestamp="2025-10-01 14:22:24 +0000 UTC" firstStartedPulling="2025-10-01 14:22:25.129252465 +0000 UTC m=+6277.032728053" lastFinishedPulling="2025-10-01 14:22:29.385516742 +0000 UTC m=+6281.288992320" observedRunningTime="2025-10-01 14:22:30.689416473 +0000 UTC m=+6282.592892081" watchObservedRunningTime="2025-10-01 14:22:30.696477959 +0000 UTC m=+6282.599953537" Oct 01 14:22:33 crc kubenswrapper[4913]: I1001 14:22:33.162850 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:33 crc kubenswrapper[4913]: I1001 14:22:33.221102 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:33 crc kubenswrapper[4913]: I1001 14:22:33.400502 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvf2r"] Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.679733 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fzk86/crc-debug-qbd6b"] Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.681707 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.703970 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvf2r" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="registry-server" containerID="cri-o://3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593" gracePeriod=2 Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.785104 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76d9340e-b465-4b53-a53c-35b198b7c794-host\") pod \"crc-debug-qbd6b\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.785442 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-999qn\" (UniqueName: \"kubernetes.io/projected/76d9340e-b465-4b53-a53c-35b198b7c794-kube-api-access-999qn\") pod \"crc-debug-qbd6b\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.886994 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-999qn\" (UniqueName: \"kubernetes.io/projected/76d9340e-b465-4b53-a53c-35b198b7c794-kube-api-access-999qn\") pod \"crc-debug-qbd6b\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.887633 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76d9340e-b465-4b53-a53c-35b198b7c794-host\") pod \"crc-debug-qbd6b\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.887691 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76d9340e-b465-4b53-a53c-35b198b7c794-host\") pod \"crc-debug-qbd6b\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:34 crc kubenswrapper[4913]: I1001 14:22:34.925798 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-999qn\" (UniqueName: \"kubernetes.io/projected/76d9340e-b465-4b53-a53c-35b198b7c794-kube-api-access-999qn\") pod \"crc-debug-qbd6b\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.021923 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.104763 4913 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.180665 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.297774 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-utilities\") pod \"0c370516-b043-4b97-a51e-56bd46f86846\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.298069 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-catalog-content\") pod \"0c370516-b043-4b97-a51e-56bd46f86846\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.298253 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhshf\" (UniqueName: \"kubernetes.io/projected/0c370516-b043-4b97-a51e-56bd46f86846-kube-api-access-zhshf\") pod \"0c370516-b043-4b97-a51e-56bd46f86846\" (UID: \"0c370516-b043-4b97-a51e-56bd46f86846\") " Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.299191 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-utilities" (OuterVolumeSpecName: "utilities") pod "0c370516-b043-4b97-a51e-56bd46f86846" (UID: "0c370516-b043-4b97-a51e-56bd46f86846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.301630 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c370516-b043-4b97-a51e-56bd46f86846-kube-api-access-zhshf" (OuterVolumeSpecName: "kube-api-access-zhshf") pod "0c370516-b043-4b97-a51e-56bd46f86846" (UID: "0c370516-b043-4b97-a51e-56bd46f86846"). InnerVolumeSpecName "kube-api-access-zhshf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.373728 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c370516-b043-4b97-a51e-56bd46f86846" (UID: "0c370516-b043-4b97-a51e-56bd46f86846"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.400243 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhshf\" (UniqueName: \"kubernetes.io/projected/0c370516-b043-4b97-a51e-56bd46f86846-kube-api-access-zhshf\") on node \"crc\" DevicePath \"\"" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.400485 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.400583 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c370516-b043-4b97-a51e-56bd46f86846-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.719957 4913 generic.go:334] "Generic (PLEG): container finished" podID="0c370516-b043-4b97-a51e-56bd46f86846" containerID="3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593" exitCode=0 Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.719989 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvf2r" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.720042 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerDied","Data":"3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593"} Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.720096 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvf2r" event={"ID":"0c370516-b043-4b97-a51e-56bd46f86846","Type":"ContainerDied","Data":"460105255cb5e4e469a8f841854accfb34398dbfa8f389e5109fe72b5499a31b"} Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.720146 4913 scope.go:117] "RemoveContainer" containerID="3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.733110 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" event={"ID":"76d9340e-b465-4b53-a53c-35b198b7c794","Type":"ContainerStarted","Data":"7e2ecddd571026c7ec592a281a078c422a5658e5809ac25fd5e6e349f6b7cae2"} Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.758091 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvf2r"] Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.768718 4913 scope.go:117] "RemoveContainer" containerID="8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.769805 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvf2r"] Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.796459 4913 scope.go:117] "RemoveContainer" containerID="d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.822256 4913 scope.go:117] "RemoveContainer" containerID="3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593" Oct 01 14:22:35 crc kubenswrapper[4913]: E1001 14:22:35.830485 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593\": container with ID starting with 3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593 not found: ID does not exist" containerID="3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.830543 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593"} err="failed to get container status \"3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593\": rpc error: code = NotFound desc = could not find container \"3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593\": container with ID starting with 3e83228578ac061dc698f389a57d9d930684bb63ded5239119a89dd26b6ee593 not found: ID does not exist" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.830576 4913 scope.go:117] "RemoveContainer" containerID="8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75" Oct 01 14:22:35 crc kubenswrapper[4913]: E1001 14:22:35.834458 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75\": container with ID starting with 8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75 not found: ID does not exist" containerID="8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.834508 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75"} err="failed to get container status \"8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75\": rpc error: code = NotFound desc = could not find container \"8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75\": container with ID starting with 8216c93f51a76098661dd7b60171cca92e92e4849cd3c298001e3fa2a576cb75 not found: ID does not exist" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.834541 4913 scope.go:117] "RemoveContainer" containerID="d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6" Oct 01 14:22:35 crc kubenswrapper[4913]: E1001 14:22:35.834885 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6\": container with ID starting with d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6 not found: ID does not exist" containerID="d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6" Oct 01 14:22:35 crc kubenswrapper[4913]: I1001 14:22:35.834915 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6"} err="failed to get container status \"d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6\": rpc error: code = NotFound desc = could not find container \"d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6\": container with ID starting with d897da92e8c569161c4b12aed68e33f1de2c6ab2c3012d76ccac6d9d68d739e6 not found: ID does not exist" Oct 01 14:22:36 crc kubenswrapper[4913]: I1001 14:22:36.820898 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c370516-b043-4b97-a51e-56bd46f86846" 
path="/var/lib/kubelet/pods/0c370516-b043-4b97-a51e-56bd46f86846/volumes" Oct 01 14:22:40 crc kubenswrapper[4913]: I1001 14:22:40.083309 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:22:40 crc kubenswrapper[4913]: I1001 14:22:40.083916 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:22:45 crc kubenswrapper[4913]: I1001 14:22:45.852870 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" event={"ID":"76d9340e-b465-4b53-a53c-35b198b7c794","Type":"ContainerStarted","Data":"29afacbd7cc4b8bd8afffbae54b6be20fdbdc65fa7889c68178721d41f39c2a9"} Oct 01 14:22:45 crc kubenswrapper[4913]: I1001 14:22:45.874954 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" podStartSLOduration=1.48714767 podStartE2EDuration="11.874934814s" podCreationTimestamp="2025-10-01 14:22:34 +0000 UTC" firstStartedPulling="2025-10-01 14:22:35.10451091 +0000 UTC m=+6287.007986478" lastFinishedPulling="2025-10-01 14:22:45.492298044 +0000 UTC m=+6297.395773622" observedRunningTime="2025-10-01 14:22:45.866445009 +0000 UTC m=+6297.769920607" watchObservedRunningTime="2025-10-01 14:22:45.874934814 +0000 UTC m=+6297.778410392" Oct 01 14:23:10 crc kubenswrapper[4913]: I1001 14:23:10.083480 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:23:10 crc kubenswrapper[4913]: I1001 14:23:10.085106 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:23:10 crc kubenswrapper[4913]: I1001 14:23:10.085259 4913 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" Oct 01 14:23:10 crc kubenswrapper[4913]: I1001 14:23:10.086131 4913 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6"} pod="openshift-machine-config-operator/machine-config-daemon-8hltg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:23:10 crc kubenswrapper[4913]: I1001 14:23:10.086326 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" containerID="cri-o://be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" gracePeriod=600 
Oct 01 14:23:10 crc kubenswrapper[4913]: E1001 14:23:10.219583 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:23:11 crc kubenswrapper[4913]: I1001 14:23:11.058666 4913 generic.go:334] "Generic (PLEG): container finished" podID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" exitCode=0 Oct 01 14:23:11 crc kubenswrapper[4913]: I1001 14:23:11.058764 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerDied","Data":"be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6"} Oct 01 14:23:11 crc kubenswrapper[4913]: I1001 14:23:11.059090 4913 scope.go:117] "RemoveContainer" containerID="504a2ad5746fe3c913dede622c0d007e14a82d5305df77d85d6a4fe2920436de" Oct 01 14:23:11 crc kubenswrapper[4913]: I1001 14:23:11.060175 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:23:11 crc kubenswrapper[4913]: E1001 14:23:11.060580 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:23:22 crc kubenswrapper[4913]: I1001 14:23:22.810993 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:23:22 crc kubenswrapper[4913]: E1001 14:23:22.812017 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:23:27 crc kubenswrapper[4913]: E1001 14:23:27.806858 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:23:34 crc kubenswrapper[4913]: I1001 14:23:34.811959 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:23:34 crc kubenswrapper[4913]: E1001 14:23:34.812807 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" 
podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:23:38 crc kubenswrapper[4913]: I1001 14:23:38.161643 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_5dc4653a-10fc-461a-a058-44cb58eb7847/ansibletest-ansibletest/0.log" Oct 01 14:23:38 crc kubenswrapper[4913]: I1001 14:23:38.406040 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55679b7754-wczpf_42099d8f-bc53-4134-9351-cbbc06da162e/barbican-api/0.log" Oct 01 14:23:38 crc kubenswrapper[4913]: I1001 14:23:38.544533 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55679b7754-wczpf_42099d8f-bc53-4134-9351-cbbc06da162e/barbican-api-log/0.log" Oct 01 14:23:38 crc kubenswrapper[4913]: I1001 14:23:38.711447 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-584b46787d-q4kkd_06b8d5ee-e00b-4c23-8fbc-c817160bac72/barbican-keystone-listener/0.log" Oct 01 14:23:39 crc kubenswrapper[4913]: I1001 14:23:39.139238 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-dd4859979-qdfbk_d7906ec7-7151-42d3-a66f-8f269a3bf03f/barbican-worker/0.log" Oct 01 14:23:39 crc kubenswrapper[4913]: I1001 14:23:39.335030 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-584b46787d-q4kkd_06b8d5ee-e00b-4c23-8fbc-c817160bac72/barbican-keystone-listener-log/0.log" Oct 01 14:23:39 crc kubenswrapper[4913]: I1001 14:23:39.374133 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-dd4859979-qdfbk_d7906ec7-7151-42d3-a66f-8f269a3bf03f/barbican-worker-log/0.log" Oct 01 14:23:39 crc kubenswrapper[4913]: I1001 14:23:39.594222 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qmrvg_615e9e8a-5d4a-410f-9fa9-5e1acfd7df02/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:39 crc kubenswrapper[4913]: I1001 14:23:39.791849 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffdfbdb5-1891-4b89-a07a-d468ac3c7155/ceilometer-central-agent/0.log" Oct 01 14:23:39 crc kubenswrapper[4913]: I1001 14:23:39.958071 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffdfbdb5-1891-4b89-a07a-d468ac3c7155/ceilometer-notification-agent/0.log" Oct 01 14:23:40 crc kubenswrapper[4913]: I1001 14:23:40.011164 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffdfbdb5-1891-4b89-a07a-d468ac3c7155/proxy-httpd/0.log" Oct 01 14:23:40 crc kubenswrapper[4913]: I1001 14:23:40.159874 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ffdfbdb5-1891-4b89-a07a-d468ac3c7155/sg-core/0.log" Oct 01 14:23:40 crc kubenswrapper[4913]: I1001 14:23:40.358663 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-cvnkt_0ee105ca-a1e0-4566-bba7-bba5eca729f0/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:40 crc kubenswrapper[4913]: I1001 14:23:40.547286 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rh9qn_58cec94b-852c-4959-a24b-04d7f83fc246/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:40 crc kubenswrapper[4913]: I1001 14:23:40.753533 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_c58551fb-1f57-46d7-9209-dc92b0ebb305/cinder-api-log/0.log" Oct 01 14:23:40 crc kubenswrapper[4913]: I1001 14:23:40.828417 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c58551fb-1f57-46d7-9209-dc92b0ebb305/cinder-api/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.057695 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed/probe/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.086714 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0d0ca0c4-37f7-43e3-8a6f-b9c5c42d02ed/cinder-backup/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.283826 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3708ac45-f021-44ed-8c85-e34c2ed73241/cinder-scheduler/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.352386 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3708ac45-f021-44ed-8c85-e34c2ed73241/probe/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.515090 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f8627d36-1d7b-40fa-b011-7a1dacddb61c/cinder-volume/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.562615 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f8627d36-1d7b-40fa-b011-7a1dacddb61c/probe/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.695565 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hf5j6_e9bedba1-5763-41ea-adc6-f0549d30df4d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.805705 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mnmxz_d5ce8f21-34e2-4e8e-ace0-59c635454fcc/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:41 crc kubenswrapper[4913]: I1001 14:23:41.978691 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fc75556d9-559t4_0908cda5-711c-4449-90fb-2fd7f524b0db/init/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.143862 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fc75556d9-559t4_0908cda5-711c-4449-90fb-2fd7f524b0db/init/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.191378 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fc75556d9-559t4_0908cda5-711c-4449-90fb-2fd7f524b0db/dnsmasq-dns/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.373482 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ed64cca4-22fc-4756-863f-8cee18a7f40e/glance-httpd/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.381241 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ed64cca4-22fc-4756-863f-8cee18a7f40e/glance-log/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.546673 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e6353b13-0958-4f48-ac1b-0a2aeef50ad8/glance-log/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.606016 4913 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e6353b13-0958-4f48-ac1b-0a2aeef50ad8/glance-httpd/0.log" Oct 01 14:23:42 crc kubenswrapper[4913]: I1001 14:23:42.851434 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-969db9cf8-b2hmw_e67e15e0-4c9f-492c-b38c-7955b5830285/horizon/0.log" Oct 01 14:23:43 crc kubenswrapper[4913]: I1001 14:23:43.060172 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_3d11e9df-3564-44cc-a29f-9d6b3f043853/horizontest-tests-horizontest/0.log" Oct 01 14:23:43 crc kubenswrapper[4913]: I1001 14:23:43.255743 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-sd7xg_f4b6596a-2679-4c64-99f5-a966e8a3deef/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:43 crc kubenswrapper[4913]: I1001 14:23:43.450804 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-m58ft_daa71a8f-cb1b-4e0a-afaa-906bc0408723/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:43 crc kubenswrapper[4913]: I1001 14:23:43.909219 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-969db9cf8-b2hmw_e67e15e0-4c9f-492c-b38c-7955b5830285/horizon-log/0.log" Oct 01 14:23:44 crc kubenswrapper[4913]: I1001 14:23:44.111500 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322061-r8fks_de7f94d4-57c9-4d92-8b60-678373217f05/keystone-cron/0.log" Oct 01 14:23:44 crc kubenswrapper[4913]: I1001 14:23:44.377695 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322121-gg2t8_f06702e3-f9e7-4e79-99d7-24b3203a1051/keystone-cron/0.log" Oct 01 14:23:44 crc kubenswrapper[4913]: I1001 14:23:44.584074 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b0093d5f-95cb-4a40-877b-01ddb11c929b/kube-state-metrics/0.log" Oct 01 14:23:44 crc kubenswrapper[4913]: I1001 14:23:44.703352 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7d8f89cd7f-sqjg8_a29fe08d-d79a-48ba-b8b8-67eda446e3c6/keystone-api/0.log" Oct 01 14:23:44 crc kubenswrapper[4913]: I1001 14:23:44.817639 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-znc8m_5248ec5f-6231-40a3-be9d-815bdf5ec259/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:45 crc kubenswrapper[4913]: I1001 14:23:45.020509 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_cb953036-cada-47b8-8f60-6a7df072c7e2/manila-api/0.log" Oct 01 14:23:45 crc kubenswrapper[4913]: I1001 14:23:45.043905 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_cb953036-cada-47b8-8f60-6a7df072c7e2/manila-api-log/0.log" Oct 01 14:23:45 crc kubenswrapper[4913]: I1001 14:23:45.237335 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_350935fd-0db6-4b14-b035-60bd31f6ea57/probe/0.log" Oct 01 14:23:45 crc kubenswrapper[4913]: I1001 14:23:45.278564 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_350935fd-0db6-4b14-b035-60bd31f6ea57/manila-scheduler/0.log" Oct 01 14:23:45 crc kubenswrapper[4913]: I1001 14:23:45.468132 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-share-share1-0_fa45eb41-34ff-42cb-97f8-71004a7e500f/manila-share/0.log" Oct 01 14:23:45 crc kubenswrapper[4913]: I1001 14:23:45.482999 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_fa45eb41-34ff-42cb-97f8-71004a7e500f/probe/0.log" Oct 01 14:23:46 crc kubenswrapper[4913]: I1001 14:23:46.288859 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6854dd75d7-6cgpn_6278eaca-e01e-4eb1-9c7f-e12fc399606a/neutron-httpd/0.log" Oct 01 14:23:46 crc kubenswrapper[4913]: I1001 14:23:46.662550 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6854dd75d7-6cgpn_6278eaca-e01e-4eb1-9c7f-e12fc399606a/neutron-api/0.log" Oct 01 14:23:46 crc kubenswrapper[4913]: I1001 14:23:46.806382 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:23:46 crc kubenswrapper[4913]: E1001 14:23:46.806753 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:23:46 crc kubenswrapper[4913]: I1001 14:23:46.820095 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gvs5z_e92e49b3-b2d4-41f0-9933-ea43cc692f5a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:48 crc kubenswrapper[4913]: I1001 14:23:48.009824 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5c193299-cf4e-4cd3-8ec0-6bba16872aa6/nova-api-log/0.log" Oct 01 14:23:48 crc kubenswrapper[4913]: I1001 14:23:48.756631 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5c193299-cf4e-4cd3-8ec0-6bba16872aa6/nova-api-api/0.log" Oct 01 14:23:48 crc kubenswrapper[4913]: I1001 14:23:48.772155 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ca9902fd-1820-476e-997b-78f8f80c9d10/nova-cell0-conductor-conductor/0.log" Oct 01 14:23:49 crc kubenswrapper[4913]: I1001 14:23:49.071795 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_bde6493b-0af5-4f13-9766-fde504bf6abc/nova-cell1-conductor-conductor/0.log" Oct 01 14:23:49 crc kubenswrapper[4913]: I1001 14:23:49.376089 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2c8c0d2b-3313-4919-9477-42d93dd1dfdc/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 14:23:49 crc kubenswrapper[4913]: I1001 14:23:49.630004 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-frzhd_669a8e92-9f6b-4ae7-9647-10c9e96d2de4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:49 crc kubenswrapper[4913]: I1001 14:23:49.954870 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd947f3e-094f-4a5e-ac65-f24ae595ffdb/nova-metadata-log/0.log" Oct 01 14:23:50 crc kubenswrapper[4913]: I1001 14:23:50.932928 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_fb00793e-8b71-47c4-8bce-1197d68a8b4b/nova-scheduler-scheduler/0.log" Oct 01 14:23:51 crc kubenswrapper[4913]: I1001 14:23:51.404680 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fe74cb0a-552b-42dd-a1af-acb58e98b7dd/mysql-bootstrap/0.log" Oct 01 14:23:51 crc kubenswrapper[4913]: I1001 14:23:51.661336 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fe74cb0a-552b-42dd-a1af-acb58e98b7dd/mysql-bootstrap/0.log" Oct 01 14:23:51 crc kubenswrapper[4913]: I1001 14:23:51.868903 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fe74cb0a-552b-42dd-a1af-acb58e98b7dd/galera/0.log" Oct 01 14:23:52 crc kubenswrapper[4913]: I1001 14:23:52.474204 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f9cea593-9b1f-44f0-8f3a-831b4b0ee98d/mysql-bootstrap/0.log" Oct 01 14:23:52 crc kubenswrapper[4913]: I1001 14:23:52.653635 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f9cea593-9b1f-44f0-8f3a-831b4b0ee98d/mysql-bootstrap/0.log" Oct 01 14:23:52 crc kubenswrapper[4913]: I1001 14:23:52.733380 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f9cea593-9b1f-44f0-8f3a-831b4b0ee98d/galera/0.log" Oct 01 14:23:52 crc kubenswrapper[4913]: I1001 14:23:52.782872 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd947f3e-094f-4a5e-ac65-f24ae595ffdb/nova-metadata-metadata/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.125427 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fe4614c3-9118-41ab-be00-667f0bbca6bb/openstackclient/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.283973 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k7dpl_59c8670e-1109-40cd-a637-636023bbd6d5/openstack-network-exporter/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.490628 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7ck28_da2fc463-557f-4a82-bd58-60b0e08930a4/ovsdb-server-init/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.679176 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7ck28_da2fc463-557f-4a82-bd58-60b0e08930a4/ovsdb-server-init/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.701208 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7ck28_da2fc463-557f-4a82-bd58-60b0e08930a4/ovsdb-server/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.740161 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7ck28_da2fc463-557f-4a82-bd58-60b0e08930a4/ovs-vswitchd/0.log" Oct 01 14:23:53 crc kubenswrapper[4913]: I1001 14:23:53.897936 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vjkgr_e90d4e3a-3c02-4d5d-84f4-32d5cb411f77/ovn-controller/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.106588 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kpl2v_910bdd9f-9b3e-43a2-af16-1c392e80d9ed/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.304299 4913 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-northd-0_4ed4667c-0b5b-4e01-b482-4ecb3caebbad/ovn-northd/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.377916 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4ed4667c-0b5b-4e01-b482-4ecb3caebbad/openstack-network-exporter/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.515556 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_71b728de-af3e-4c33-baad-98a46852c91f/openstack-network-exporter/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.590688 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_71b728de-af3e-4c33-baad-98a46852c91f/ovsdbserver-nb/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.761351 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f930acb-c2e8-4c5f-8d1b-acdc15375467/openstack-network-exporter/0.log" Oct 01 14:23:54 crc kubenswrapper[4913]: I1001 14:23:54.827257 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f930acb-c2e8-4c5f-8d1b-acdc15375467/ovsdbserver-sb/0.log" Oct 01 14:23:55 crc kubenswrapper[4913]: I1001 14:23:55.222571 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66d867dfb6-r9zrq_11336389-1acf-4342-b478-e11f04e7848d/placement-api/0.log" Oct 01 14:23:55 crc kubenswrapper[4913]: I1001 14:23:55.429923 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66d867dfb6-r9zrq_11336389-1acf-4342-b478-e11f04e7848d/placement-log/0.log" Oct 01 14:23:55 crc kubenswrapper[4913]: I1001 14:23:55.457421 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7d60c488-fc39-4b3b-bd78-839f6975bcfa/setup-container/0.log" Oct 01 14:23:55 crc kubenswrapper[4913]: I1001 14:23:55.682783 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7d60c488-fc39-4b3b-bd78-839f6975bcfa/setup-container/0.log" Oct 01 14:23:55 crc kubenswrapper[4913]: I1001 14:23:55.734530 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_7d60c488-fc39-4b3b-bd78-839f6975bcfa/rabbitmq/0.log" Oct 01 14:23:55 crc kubenswrapper[4913]: I1001 14:23:55.922678 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3e71ec0f-d0d7-40a1-b83c-20f0dc177473/setup-container/0.log" Oct 01 14:23:56 crc kubenswrapper[4913]: I1001 14:23:56.212772 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3e71ec0f-d0d7-40a1-b83c-20f0dc177473/setup-container/0.log" Oct 01 14:23:56 crc kubenswrapper[4913]: I1001 14:23:56.221670 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3e71ec0f-d0d7-40a1-b83c-20f0dc177473/rabbitmq/0.log" Oct 01 14:23:56 crc kubenswrapper[4913]: I1001 14:23:56.437963 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xjx52_a6dcbc19-3406-42c7-bc2c-8b17e750a3cd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:56 crc kubenswrapper[4913]: I1001 14:23:56.536348 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-j2p7l_5a28de19-ceb2-4b36-ae51-1b69d134b6fd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:56 crc kubenswrapper[4913]: I1001 
14:23:56.720081 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rpp5r_629d4b85-c10c-45cc-ad03-4954acadaf15/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:56 crc kubenswrapper[4913]: I1001 14:23:56.987859 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2j2qp_2218b097-a775-4dca-88c6-d7676fc4ef97/ssh-known-hosts-edpm-deployment/0.log" Oct 01 14:23:57 crc kubenswrapper[4913]: I1001 14:23:57.463485 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_3a042f5f-1b7b-42fa-b8b4-db936848fbe9/tempest-tests-tempest-tests-runner/0.log" Oct 01 14:23:57 crc kubenswrapper[4913]: I1001 14:23:57.643922 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_96a792f4-42c4-4e02-8ac3-e49612ad6a30/test-operator-logs-container/0.log" Oct 01 14:23:57 crc kubenswrapper[4913]: I1001 14:23:57.842945 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_d933ac8f-d113-4120-8848-65d6c2affdac/test-operator-logs-container/0.log" Oct 01 14:23:58 crc kubenswrapper[4913]: I1001 14:23:58.104950 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8a0f7457-8032-4fb8-8784-126cad8b13a8/test-operator-logs-container/0.log" Oct 01 14:23:58 crc kubenswrapper[4913]: I1001 14:23:58.282721 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_3ef07e68-9334-463a-888f-1fd9fe3d3f1c/tempest-tests-tempest-tests-runner/0.log" Oct 01 14:23:58 crc kubenswrapper[4913]: I1001 14:23:58.287186 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_aa3d2c62-1c98-4879-bb9f-c9d48d6ad57a/test-operator-logs-container/0.log" Oct 01 14:23:58 crc kubenswrapper[4913]: I1001 14:23:58.492483 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_567cfe78-d6e0-4a12-99d7-a280aeb55e68/tobiko-tests-tobiko/0.log" Oct 01 14:23:58 crc kubenswrapper[4913]: I1001 14:23:58.821943 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_d2b76a33-156d-4aa9-9ee3-c17ac4bc753d/tobiko-tests-tobiko/0.log" Oct 01 14:23:58 crc kubenswrapper[4913]: I1001 14:23:58.891566 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-q4g27_01e03618-1c33-4ee7-8b54-c07fef3946e2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:23:59 crc kubenswrapper[4913]: I1001 14:23:59.248341 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0cfc30d4-ee8f-4491-a558-cb067b0dec39/memcached/0.log" Oct 01 14:24:01 crc kubenswrapper[4913]: I1001 14:24:01.807349 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:24:01 crc kubenswrapper[4913]: E1001 14:24:01.808026 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:24:12 crc kubenswrapper[4913]: I1001 14:24:12.806756 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:24:12 crc kubenswrapper[4913]: E1001 14:24:12.807496 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:24:27 crc kubenswrapper[4913]: I1001 14:24:27.806948 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:24:27 crc kubenswrapper[4913]: E1001 14:24:27.807748 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.954121 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p2j57"] Oct 01 14:24:33 crc kubenswrapper[4913]: E1001 14:24:33.955118 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="registry-server" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.955139 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="registry-server" Oct 01 14:24:33 crc kubenswrapper[4913]: E1001 14:24:33.955183 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="extract-utilities" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.955196 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="extract-utilities" Oct 01 14:24:33 crc kubenswrapper[4913]: E1001 14:24:33.955214 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="extract-content" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.955225 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="extract-content" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.955565 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c370516-b043-4b97-a51e-56bd46f86846" containerName="registry-server" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.957903 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:33 crc kubenswrapper[4913]: I1001 14:24:33.968002 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p2j57"] Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.049003 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-utilities\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.049183 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-catalog-content\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.049259 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhc2s\" (UniqueName: \"kubernetes.io/projected/ebbb97c1-7584-4943-ac18-6e88485d4a6f-kube-api-access-dhc2s\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.150885 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-utilities\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.150997 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-catalog-content\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.151050 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhc2s\" (UniqueName: \"kubernetes.io/projected/ebbb97c1-7584-4943-ac18-6e88485d4a6f-kube-api-access-dhc2s\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.151495 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-utilities\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.151648 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-catalog-content\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.185181 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dhc2s\" (UniqueName: \"kubernetes.io/projected/ebbb97c1-7584-4943-ac18-6e88485d4a6f-kube-api-access-dhc2s\") pod \"certified-operators-p2j57\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.324763 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:34 crc kubenswrapper[4913]: I1001 14:24:34.937341 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p2j57"] Oct 01 14:24:35 crc kubenswrapper[4913]: I1001 14:24:35.892192 4913 generic.go:334] "Generic (PLEG): container finished" podID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerID="1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7" exitCode=0 Oct 01 14:24:35 crc kubenswrapper[4913]: I1001 14:24:35.892844 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p2j57" event={"ID":"ebbb97c1-7584-4943-ac18-6e88485d4a6f","Type":"ContainerDied","Data":"1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7"} Oct 01 14:24:35 crc kubenswrapper[4913]: I1001 14:24:35.892886 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p2j57" event={"ID":"ebbb97c1-7584-4943-ac18-6e88485d4a6f","Type":"ContainerStarted","Data":"f25e3afe72ffb2796c5515266463301aa080739412aa5577867717e22c435f52"} Oct 01 14:24:37 crc kubenswrapper[4913]: I1001 14:24:37.910586 4913 generic.go:334] "Generic (PLEG): container finished" podID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerID="a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60" exitCode=0 Oct 01 14:24:37 crc kubenswrapper[4913]: I1001 14:24:37.910702 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p2j57" event={"ID":"ebbb97c1-7584-4943-ac18-6e88485d4a6f","Type":"ContainerDied","Data":"a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60"} Oct 01 14:24:38 crc kubenswrapper[4913]: E1001 14:24:38.813746 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:24:38 crc kubenswrapper[4913]: I1001 14:24:38.922208 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p2j57" event={"ID":"ebbb97c1-7584-4943-ac18-6e88485d4a6f","Type":"ContainerStarted","Data":"1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513"} Oct 01 14:24:38 crc kubenswrapper[4913]: I1001 14:24:38.949347 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p2j57" podStartSLOduration=3.220267012 podStartE2EDuration="5.949330694s" podCreationTimestamp="2025-10-01 14:24:33 +0000 UTC" firstStartedPulling="2025-10-01 14:24:35.897187248 +0000 UTC m=+6407.800662826" lastFinishedPulling="2025-10-01 14:24:38.62625094 +0000 UTC m=+6410.529726508" observedRunningTime="2025-10-01 14:24:38.942370372 +0000 UTC m=+6410.845845960" watchObservedRunningTime="2025-10-01 14:24:38.949330694 +0000 UTC m=+6410.852806272" Oct 01 14:24:40 crc kubenswrapper[4913]: I1001 14:24:40.807920 4913 scope.go:117] "RemoveContainer" 
containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:24:40 crc kubenswrapper[4913]: E1001 14:24:40.808562 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:24:44 crc kubenswrapper[4913]: I1001 14:24:44.325716 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:44 crc kubenswrapper[4913]: I1001 14:24:44.326036 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:44 crc kubenswrapper[4913]: I1001 14:24:44.387453 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:45 crc kubenswrapper[4913]: I1001 14:24:45.020941 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:45 crc kubenswrapper[4913]: I1001 14:24:45.063129 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p2j57"] Oct 01 14:24:46 crc kubenswrapper[4913]: I1001 14:24:46.985729 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p2j57" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="registry-server" containerID="cri-o://1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513" gracePeriod=2 Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.527760 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.665185 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-catalog-content\") pod \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.665948 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhc2s\" (UniqueName: \"kubernetes.io/projected/ebbb97c1-7584-4943-ac18-6e88485d4a6f-kube-api-access-dhc2s\") pod \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.666139 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-utilities\") pod \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\" (UID: \"ebbb97c1-7584-4943-ac18-6e88485d4a6f\") " Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.666824 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-utilities" (OuterVolumeSpecName: "utilities") pod "ebbb97c1-7584-4943-ac18-6e88485d4a6f" (UID: "ebbb97c1-7584-4943-ac18-6e88485d4a6f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.683259 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbb97c1-7584-4943-ac18-6e88485d4a6f-kube-api-access-dhc2s" (OuterVolumeSpecName: "kube-api-access-dhc2s") pod "ebbb97c1-7584-4943-ac18-6e88485d4a6f" (UID: "ebbb97c1-7584-4943-ac18-6e88485d4a6f"). InnerVolumeSpecName "kube-api-access-dhc2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.768827 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhc2s\" (UniqueName: \"kubernetes.io/projected/ebbb97c1-7584-4943-ac18-6e88485d4a6f-kube-api-access-dhc2s\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.768864 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.778412 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebbb97c1-7584-4943-ac18-6e88485d4a6f" (UID: "ebbb97c1-7584-4943-ac18-6e88485d4a6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.870663 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbb97c1-7584-4943-ac18-6e88485d4a6f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.997006 4913 generic.go:334] "Generic (PLEG): container finished" podID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerID="1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513" exitCode=0 Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.997056 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p2j57" event={"ID":"ebbb97c1-7584-4943-ac18-6e88485d4a6f","Type":"ContainerDied","Data":"1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513"} Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.997085 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p2j57" event={"ID":"ebbb97c1-7584-4943-ac18-6e88485d4a6f","Type":"ContainerDied","Data":"f25e3afe72ffb2796c5515266463301aa080739412aa5577867717e22c435f52"} Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.997103 4913 scope.go:117] "RemoveContainer" containerID="1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513" Oct 01 14:24:47 crc kubenswrapper[4913]: I1001 14:24:47.997247 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p2j57" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.024904 4913 scope.go:117] "RemoveContainer" containerID="a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.034621 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p2j57"] Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.046579 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p2j57"] Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.054234 4913 scope.go:117] "RemoveContainer" containerID="1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.090639 4913 scope.go:117] "RemoveContainer" containerID="1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513" Oct 01 14:24:48 crc kubenswrapper[4913]: E1001 14:24:48.091090 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513\": container with ID starting with 1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513 not found: ID does not exist" containerID="1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.091131 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513"} err="failed to get container status \"1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513\": rpc error: code = NotFound desc = could not find container \"1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513\": container with ID starting with 1588037ea930028aef857723ccfe5b25a2c47e6600e15a53c2608914258a5513 not found: ID does not exist" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.091159 4913 scope.go:117] "RemoveContainer" containerID="a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60" Oct 01 14:24:48 crc kubenswrapper[4913]: E1001 14:24:48.091610 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60\": container with ID starting with a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60 not found: ID does not exist" containerID="a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.091652 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60"} err="failed to get container status \"a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60\": rpc error: code = NotFound desc = could not find container \"a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60\": container with ID starting with a626565195370aac7ea57bb1bb37022a1ead79717f9ae8249df13bb6e0bdcd60 not found: ID does not exist" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.091666 4913 scope.go:117] "RemoveContainer" containerID="1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7" Oct 01 14:24:48 crc kubenswrapper[4913]: E1001 14:24:48.091977 4913 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7\": container with ID starting with 1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7 not found: ID does not exist" containerID="1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.092028 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7"} err="failed to get container status \"1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7\": rpc error: code = NotFound desc = could not find container \"1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7\": container with ID starting with 1725233dcf94b211ad44c504a939d785363123fff632128290202bd5791460e7 not found: ID does not exist" Oct 01 14:24:48 crc kubenswrapper[4913]: I1001 14:24:48.836350 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" path="/var/lib/kubelet/pods/ebbb97c1-7584-4943-ac18-6e88485d4a6f/volumes" Oct 01 14:24:51 crc kubenswrapper[4913]: I1001 14:24:51.807560 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:24:51 crc kubenswrapper[4913]: E1001 14:24:51.808562 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:24:56 crc kubenswrapper[4913]: I1001 14:24:56.086372 4913 generic.go:334] "Generic (PLEG): container finished" podID="76d9340e-b465-4b53-a53c-35b198b7c794" containerID="29afacbd7cc4b8bd8afffbae54b6be20fdbdc65fa7889c68178721d41f39c2a9" exitCode=0 Oct 01 14:24:56 crc kubenswrapper[4913]: I1001 14:24:56.086421 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" event={"ID":"76d9340e-b465-4b53-a53c-35b198b7c794","Type":"ContainerDied","Data":"29afacbd7cc4b8bd8afffbae54b6be20fdbdc65fa7889c68178721d41f39c2a9"} Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.199423 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.229416 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fzk86/crc-debug-qbd6b"] Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.248224 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fzk86/crc-debug-qbd6b"] Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.361027 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76d9340e-b465-4b53-a53c-35b198b7c794-host\") pod \"76d9340e-b465-4b53-a53c-35b198b7c794\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.361127 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76d9340e-b465-4b53-a53c-35b198b7c794-host" (OuterVolumeSpecName: "host") pod "76d9340e-b465-4b53-a53c-35b198b7c794" (UID: "76d9340e-b465-4b53-a53c-35b198b7c794"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.361203 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-999qn\" (UniqueName: \"kubernetes.io/projected/76d9340e-b465-4b53-a53c-35b198b7c794-kube-api-access-999qn\") pod \"76d9340e-b465-4b53-a53c-35b198b7c794\" (UID: \"76d9340e-b465-4b53-a53c-35b198b7c794\") " Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.361786 4913 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76d9340e-b465-4b53-a53c-35b198b7c794-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.367329 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d9340e-b465-4b53-a53c-35b198b7c794-kube-api-access-999qn" (OuterVolumeSpecName: "kube-api-access-999qn") pod "76d9340e-b465-4b53-a53c-35b198b7c794" (UID: "76d9340e-b465-4b53-a53c-35b198b7c794"). InnerVolumeSpecName "kube-api-access-999qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:24:57 crc kubenswrapper[4913]: I1001 14:24:57.463980 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-999qn\" (UniqueName: \"kubernetes.io/projected/76d9340e-b465-4b53-a53c-35b198b7c794-kube-api-access-999qn\") on node \"crc\" DevicePath \"\"" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.104109 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e2ecddd571026c7ec592a281a078c422a5658e5809ac25fd5e6e349f6b7cae2" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.104158 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-qbd6b" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.433689 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fzk86/crc-debug-tj2v6"] Oct 01 14:24:58 crc kubenswrapper[4913]: E1001 14:24:58.434210 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="extract-content" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.434227 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="extract-content" Oct 01 14:24:58 crc kubenswrapper[4913]: E1001 14:24:58.434238 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="registry-server" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.434245 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="registry-server" Oct 01 14:24:58 crc kubenswrapper[4913]: E1001 14:24:58.434263 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="extract-utilities" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.434293 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="extract-utilities" Oct 01 14:24:58 crc kubenswrapper[4913]: E1001 14:24:58.434316 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d9340e-b465-4b53-a53c-35b198b7c794" containerName="container-00" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.434323 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d9340e-b465-4b53-a53c-35b198b7c794" containerName="container-00" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.434580 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbb97c1-7584-4943-ac18-6e88485d4a6f" containerName="registry-server" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.434603 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d9340e-b465-4b53-a53c-35b198b7c794" containerName="container-00" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.435417 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.585284 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4m4\" (UniqueName: \"kubernetes.io/projected/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-kube-api-access-mm4m4\") pod \"crc-debug-tj2v6\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.586240 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-host\") pod \"crc-debug-tj2v6\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.688444 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-host\") pod \"crc-debug-tj2v6\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.688517 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4m4\" (UniqueName: \"kubernetes.io/projected/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-kube-api-access-mm4m4\") pod \"crc-debug-tj2v6\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.688877 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-host\") pod \"crc-debug-tj2v6\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.714651 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4m4\" (UniqueName: \"kubernetes.io/projected/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-kube-api-access-mm4m4\") pod \"crc-debug-tj2v6\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.751557 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:24:58 crc kubenswrapper[4913]: I1001 14:24:58.819748 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d9340e-b465-4b53-a53c-35b198b7c794" path="/var/lib/kubelet/pods/76d9340e-b465-4b53-a53c-35b198b7c794/volumes" Oct 01 14:24:59 crc kubenswrapper[4913]: I1001 14:24:59.116073 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" event={"ID":"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be","Type":"ContainerStarted","Data":"93dfea2f9a478116ae9175f780c12a8d44ba13f8739f82855099285eeb01d7dd"} Oct 01 14:24:59 crc kubenswrapper[4913]: I1001 14:24:59.116448 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" event={"ID":"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be","Type":"ContainerStarted","Data":"7fb16bc9495d84219be412e6fdc4b1e9f5e63c19546e8314c72b570c5463d37c"} Oct 01 14:25:00 crc kubenswrapper[4913]: I1001 14:25:00.125351 4913 generic.go:334] "Generic (PLEG): container finished" podID="f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" containerID="93dfea2f9a478116ae9175f780c12a8d44ba13f8739f82855099285eeb01d7dd" exitCode=0 Oct 01 14:25:00 crc kubenswrapper[4913]: I1001 14:25:00.125586 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" event={"ID":"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be","Type":"ContainerDied","Data":"93dfea2f9a478116ae9175f780c12a8d44ba13f8739f82855099285eeb01d7dd"} Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.259318 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.330017 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4m4\" (UniqueName: \"kubernetes.io/projected/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-kube-api-access-mm4m4\") pod \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.330122 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-host\") pod \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\" (UID: \"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be\") " Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.330494 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-host" (OuterVolumeSpecName: "host") pod "f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" (UID: "f805e6f8-2cd4-406d-b67f-33a8f0f4e2be"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.330966 4913 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.336938 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-kube-api-access-mm4m4" (OuterVolumeSpecName: "kube-api-access-mm4m4") pod "f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" (UID: "f805e6f8-2cd4-406d-b67f-33a8f0f4e2be"). InnerVolumeSpecName "kube-api-access-mm4m4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:25:01 crc kubenswrapper[4913]: I1001 14:25:01.432243 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4m4\" (UniqueName: \"kubernetes.io/projected/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be-kube-api-access-mm4m4\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:02 crc kubenswrapper[4913]: I1001 14:25:02.144392 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" event={"ID":"f805e6f8-2cd4-406d-b67f-33a8f0f4e2be","Type":"ContainerDied","Data":"7fb16bc9495d84219be412e6fdc4b1e9f5e63c19546e8314c72b570c5463d37c"} Oct 01 14:25:02 crc kubenswrapper[4913]: I1001 14:25:02.144436 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fb16bc9495d84219be412e6fdc4b1e9f5e63c19546e8314c72b570c5463d37c" Oct 01 14:25:02 crc kubenswrapper[4913]: I1001 14:25:02.144496 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-tj2v6" Oct 01 14:25:02 crc kubenswrapper[4913]: I1001 14:25:02.807923 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:25:02 crc kubenswrapper[4913]: E1001 14:25:02.808830 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:25:09 crc kubenswrapper[4913]: I1001 14:25:09.399656 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fzk86/crc-debug-tj2v6"] Oct 01 14:25:09 crc kubenswrapper[4913]: I1001 14:25:09.406969 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fzk86/crc-debug-tj2v6"] Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.564512 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fzk86/crc-debug-5bz7b"] Oct 01 14:25:10 crc kubenswrapper[4913]: E1001 14:25:10.565361 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" containerName="container-00" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.565376 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" containerName="container-00" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.565605 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" containerName="container-00" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.566467 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.700798 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec238239-f8bf-4452-81fd-f9e5e416989d-host\") pod \"crc-debug-5bz7b\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.701189 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4g94\" (UniqueName: \"kubernetes.io/projected/ec238239-f8bf-4452-81fd-f9e5e416989d-kube-api-access-p4g94\") pod \"crc-debug-5bz7b\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.803176 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec238239-f8bf-4452-81fd-f9e5e416989d-host\") pod \"crc-debug-5bz7b\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.803258 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4g94\" (UniqueName: \"kubernetes.io/projected/ec238239-f8bf-4452-81fd-f9e5e416989d-kube-api-access-p4g94\") pod \"crc-debug-5bz7b\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.804107 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec238239-f8bf-4452-81fd-f9e5e416989d-host\") pod \"crc-debug-5bz7b\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.820766 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f805e6f8-2cd4-406d-b67f-33a8f0f4e2be" path="/var/lib/kubelet/pods/f805e6f8-2cd4-406d-b67f-33a8f0f4e2be/volumes" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.824101 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4g94\" (UniqueName: \"kubernetes.io/projected/ec238239-f8bf-4452-81fd-f9e5e416989d-kube-api-access-p4g94\") pod \"crc-debug-5bz7b\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:10 crc kubenswrapper[4913]: I1001 14:25:10.887872 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:11 crc kubenswrapper[4913]: I1001 14:25:11.240415 4913 generic.go:334] "Generic (PLEG): container finished" podID="ec238239-f8bf-4452-81fd-f9e5e416989d" containerID="63a16f2041608f74d5155ed5e5ec6e941df3680bc79821bc46475bc7020a7434" exitCode=0 Oct 01 14:25:11 crc kubenswrapper[4913]: I1001 14:25:11.240489 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-5bz7b" event={"ID":"ec238239-f8bf-4452-81fd-f9e5e416989d","Type":"ContainerDied","Data":"63a16f2041608f74d5155ed5e5ec6e941df3680bc79821bc46475bc7020a7434"} Oct 01 14:25:11 crc kubenswrapper[4913]: I1001 14:25:11.240894 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/crc-debug-5bz7b" event={"ID":"ec238239-f8bf-4452-81fd-f9e5e416989d","Type":"ContainerStarted","Data":"1cbedda2e4b289ce8813f968884e777c4ca02d23a153d27abff607773a8dc9c8"} Oct 01 14:25:11 crc kubenswrapper[4913]: I1001 14:25:11.297042 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fzk86/crc-debug-5bz7b"] Oct 01 14:25:11 crc kubenswrapper[4913]: I1001 14:25:11.310409 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fzk86/crc-debug-5bz7b"] Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.414853 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.534907 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4g94\" (UniqueName: \"kubernetes.io/projected/ec238239-f8bf-4452-81fd-f9e5e416989d-kube-api-access-p4g94\") pod \"ec238239-f8bf-4452-81fd-f9e5e416989d\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.535112 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec238239-f8bf-4452-81fd-f9e5e416989d-host\") pod \"ec238239-f8bf-4452-81fd-f9e5e416989d\" (UID: \"ec238239-f8bf-4452-81fd-f9e5e416989d\") " Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.535316 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec238239-f8bf-4452-81fd-f9e5e416989d-host" (OuterVolumeSpecName: "host") pod "ec238239-f8bf-4452-81fd-f9e5e416989d" (UID: "ec238239-f8bf-4452-81fd-f9e5e416989d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.535651 4913 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec238239-f8bf-4452-81fd-f9e5e416989d-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.540575 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec238239-f8bf-4452-81fd-f9e5e416989d-kube-api-access-p4g94" (OuterVolumeSpecName: "kube-api-access-p4g94") pod "ec238239-f8bf-4452-81fd-f9e5e416989d" (UID: "ec238239-f8bf-4452-81fd-f9e5e416989d"). InnerVolumeSpecName "kube-api-access-p4g94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.637318 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4g94\" (UniqueName: \"kubernetes.io/projected/ec238239-f8bf-4452-81fd-f9e5e416989d-kube-api-access-p4g94\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.817851 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec238239-f8bf-4452-81fd-f9e5e416989d" path="/var/lib/kubelet/pods/ec238239-f8bf-4452-81fd-f9e5e416989d/volumes" Oct 01 14:25:12 crc kubenswrapper[4913]: I1001 14:25:12.947043 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/util/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.169090 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/util/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.192780 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/pull/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.202007 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/pull/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.261716 4913 scope.go:117] "RemoveContainer" containerID="63a16f2041608f74d5155ed5e5ec6e941df3680bc79821bc46475bc7020a7434" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.261760 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/crc-debug-5bz7b" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.388103 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/util/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.400930 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/extract/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.401829 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0e56acba034a4fe5e21c87878f7cabfd0ead2befdef111b141ca11c71a5htdt_848f3124-2a1f-45fa-bb83-893c3db866ae/pull/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.575637 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-wdw6b_086685ca-996d-44d9-bd02-33cc99e5dab9/kube-rbac-proxy/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.685614 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-p6xjl_9d418cfb-ec47-4ba7-b29b-5e68fddf11e4/kube-rbac-proxy/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.696788 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-wdw6b_086685ca-996d-44d9-bd02-33cc99e5dab9/manager/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.807195 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:25:13 crc kubenswrapper[4913]: E1001 14:25:13.807556 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.861873 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-p6xjl_9d418cfb-ec47-4ba7-b29b-5e68fddf11e4/manager/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.879819 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-pbjqc_cb312f93-72ce-4d3e-9d89-66526e40dca2/kube-rbac-proxy/0.log" Oct 01 14:25:13 crc kubenswrapper[4913]: I1001 14:25:13.941945 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-pbjqc_cb312f93-72ce-4d3e-9d89-66526e40dca2/manager/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.038380 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-rxgj5_55b36494-1091-44b0-b303-ac62e5cef841/kube-rbac-proxy/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.129427 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-rxgj5_55b36494-1091-44b0-b303-ac62e5cef841/manager/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.261605 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-gq8sp_d7b4ec90-a547-48a1-83ac-3528c53f90f0/kube-rbac-proxy/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.277649 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-gq8sp_d7b4ec90-a547-48a1-83ac-3528c53f90f0/manager/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.423279 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-mh5sx_845f41f1-b619-4984-aa2a-bae46992d463/kube-rbac-proxy/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.464644 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-mh5sx_845f41f1-b619-4984-aa2a-bae46992d463/manager/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.583662 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5c8fdc4d5c-2ztjf_5ac2ed71-cb90-4003-b8f9-5ad6748c08d5/kube-rbac-proxy/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.760348 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f45cd594f-hdrq8_5c79503f-3029-498b-b9b1-e2df43820cb2/kube-rbac-proxy/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.774926 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5c8fdc4d5c-2ztjf_5ac2ed71-cb90-4003-b8f9-5ad6748c08d5/manager/0.log" Oct 01 14:25:14 crc kubenswrapper[4913]: I1001 14:25:14.793621 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f45cd594f-hdrq8_5c79503f-3029-498b-b9b1-e2df43820cb2/manager/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.109463 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-s7zlm_e5ec32c8-323f-4c74-bf82-4dc2a70db41a/kube-rbac-proxy/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.219125 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-s7zlm_e5ec32c8-323f-4c74-bf82-4dc2a70db41a/manager/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.322518 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-5jvz7_13e74825-5720-4c49-97e9-a0fccf649b50/kube-rbac-proxy/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.392966 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-5jvz7_13e74825-5720-4c49-97e9-a0fccf649b50/manager/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.439236 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-nsm78_8f9b91c6-b4e9-44b3-83c2-42412d48de96/kube-rbac-proxy/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.593372 4913 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-nsm78_8f9b91c6-b4e9-44b3-83c2-42412d48de96/manager/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.665913 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54fbbfcd44-f8tw7_e80ad896-dd86-43e6-850b-f12088a61cf5/kube-rbac-proxy/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.735693 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54fbbfcd44-f8tw7_e80ad896-dd86-43e6-850b-f12088a61cf5/manager/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.840176 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7fd5b6bbc6-45w5m_a34a6147-f982-48cf-9976-b39c4dd420cf/kube-rbac-proxy/0.log" Oct 01 14:25:15 crc kubenswrapper[4913]: I1001 14:25:15.916328 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7fd5b6bbc6-45w5m_a34a6147-f982-48cf-9976-b39c4dd420cf/manager/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.105470 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-75f8d67d86-pmhs8_b7681632-31dc-4278-889a-8b89ddccac74/kube-rbac-proxy/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.146101 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-75f8d67d86-pmhs8_b7681632-31dc-4278-889a-8b89ddccac74/manager/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.195076 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-659bb84579hvlsl_3f17e8a8-3251-4a03-8f8d-70698d5146b3/kube-rbac-proxy/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.335760 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-659bb84579hvlsl_3f17e8a8-3251-4a03-8f8d-70698d5146b3/manager/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.341128 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6c7b6bcb7c-dvhfd_ad1d9777-734a-412f-a917-8d6b497dcb32/kube-rbac-proxy/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.572865 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6477d86654-p7vhq_95583d77-cff7-452a-85eb-0674b64df62a/kube-rbac-proxy/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.622778 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6477d86654-p7vhq_95583d77-cff7-452a-85eb-0674b64df62a/operator/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.921355 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-f6wpw_f28004e7-4e00-4ffc-ae3a-cfad4022387a/kube-rbac-proxy/0.log" Oct 01 14:25:16 crc kubenswrapper[4913]: I1001 14:25:16.954690 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xptgj_5262687f-5e25-4632-8122-9e15fb72e8d9/registry-server/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 
14:25:17.044443 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-f6wpw_f28004e7-4e00-4ffc-ae3a-cfad4022387a/manager/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.170475 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-pvvml_61fd19a3-0b06-401a-9bf3-9d2a34bbf291/kube-rbac-proxy/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.231939 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-pvvml_61fd19a3-0b06-401a-9bf3-9d2a34bbf291/manager/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.342830 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-2mpn5_b0d4dbf5-aa13-46ec-ab24-5c43a0be638c/operator/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.425366 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-689b4f76c9-8668d_a19e1891-5907-42dc-9894-6c9b3bcb5cce/kube-rbac-proxy/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.492754 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-689b4f76c9-8668d_a19e1891-5907-42dc-9894-6c9b3bcb5cce/manager/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.639189 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-2djx2_1549da87-3151-467a-92d6-de0709a3a6a7/kube-rbac-proxy/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.641953 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6c7b6bcb7c-dvhfd_ad1d9777-734a-412f-a917-8d6b497dcb32/manager/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.745007 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-2djx2_1549da87-3151-467a-92d6-de0709a3a6a7/manager/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.867501 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5846bf4994-z259j_fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2/kube-rbac-proxy/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.879371 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5846bf4994-z259j_fd53fc8f-4ea6-4bd8-9805-af0f7e442bc2/manager/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.945203 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-68d7bc5569-4g45w_e0ef9126-ad52-4803-b11d-f8b1712c4efd/kube-rbac-proxy/0.log" Oct 01 14:25:17 crc kubenswrapper[4913]: I1001 14:25:17.977966 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-68d7bc5569-4g45w_e0ef9126-ad52-4803-b11d-f8b1712c4efd/manager/0.log" Oct 01 14:25:28 crc kubenswrapper[4913]: I1001 14:25:28.816024 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:25:28 crc kubenswrapper[4913]: E1001 14:25:28.816956 4913 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:25:31 crc kubenswrapper[4913]: I1001 14:25:31.841948 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bkgdg_9689e1f9-5b48-47da-af2f-dc1db858196d/control-plane-machine-set-operator/0.log" Oct 01 14:25:32 crc kubenswrapper[4913]: I1001 14:25:32.067978 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fm7mq_92b2bff6-3b61-4d3b-8d88-9077b02ed990/machine-api-operator/0.log" Oct 01 14:25:32 crc kubenswrapper[4913]: I1001 14:25:32.201915 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fm7mq_92b2bff6-3b61-4d3b-8d88-9077b02ed990/kube-rbac-proxy/0.log" Oct 01 14:25:42 crc kubenswrapper[4913]: I1001 14:25:42.890139 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-n8tcp_f6dbba57-881b-4fcf-8c71-4a7aa5eb7bd7/cert-manager-controller/0.log" Oct 01 14:25:43 crc kubenswrapper[4913]: I1001 14:25:43.023992 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-m2cpf_30c02957-87a8-4ab3-bbcf-9248f4c9ffc6/cert-manager-cainjector/0.log" Oct 01 14:25:43 crc kubenswrapper[4913]: I1001 14:25:43.092352 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2z957_47e455bc-ca7f-42fc-a85d-720561425b25/cert-manager-webhook/0.log" Oct 01 14:25:43 crc kubenswrapper[4913]: I1001 14:25:43.806508 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:25:43 crc kubenswrapper[4913]: E1001 14:25:43.806760 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:25:50 crc kubenswrapper[4913]: E1001 14:25:50.808496 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:25:53 crc kubenswrapper[4913]: I1001 14:25:53.636593 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-nh8ft_e6565ecd-6027-4555-888c-da3a16c20260/nmstate-console-plugin/0.log" Oct 01 14:25:53 crc kubenswrapper[4913]: I1001 14:25:53.813676 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-trqgg_0ca9e5b5-947e-426a-81cc-8ce9774da263/nmstate-handler/0.log" Oct 01 14:25:53 crc kubenswrapper[4913]: I1001 14:25:53.835689 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-r6h59_86077535-3ad3-414d-ad9d-5b0107ec2cf0/kube-rbac-proxy/0.log" Oct 01 14:25:53 crc kubenswrapper[4913]: I1001 14:25:53.885430 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-r6h59_86077535-3ad3-414d-ad9d-5b0107ec2cf0/nmstate-metrics/0.log" Oct 01 14:25:54 crc kubenswrapper[4913]: I1001 14:25:54.054366 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-bnx22_bad76a8b-0b0d-4a6e-870d-14f138beb4fb/nmstate-webhook/0.log" Oct 01 14:25:54 crc kubenswrapper[4913]: I1001 14:25:54.086348 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-s6pxh_652be88b-3bb3-4c4d-9aa7-bc1494c53cb3/nmstate-operator/0.log" Oct 01 14:25:54 crc kubenswrapper[4913]: I1001 14:25:54.807675 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:25:54 crc kubenswrapper[4913]: E1001 14:25:54.808006 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.360332 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qcbgr"] Oct 01 14:26:01 crc kubenswrapper[4913]: E1001 14:26:01.362747 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec238239-f8bf-4452-81fd-f9e5e416989d" containerName="container-00" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.362880 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec238239-f8bf-4452-81fd-f9e5e416989d" containerName="container-00" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.363176 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec238239-f8bf-4452-81fd-f9e5e416989d" containerName="container-00" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.365077 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.369724 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcbgr"] Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.497684 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-utilities\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.497760 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-catalog-content\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.497833 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5ff\" (UniqueName: \"kubernetes.io/projected/28e92916-51c7-4c12-a2df-af8ef7b33521-kube-api-access-rx5ff\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.600186 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5ff\" (UniqueName: \"kubernetes.io/projected/28e92916-51c7-4c12-a2df-af8ef7b33521-kube-api-access-rx5ff\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.600401 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-utilities\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.600446 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-catalog-content\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.600902 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-utilities\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.600991 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-catalog-content\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.618586 4913 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rx5ff\" (UniqueName: \"kubernetes.io/projected/28e92916-51c7-4c12-a2df-af8ef7b33521-kube-api-access-rx5ff\") pod \"redhat-marketplace-qcbgr\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:01 crc kubenswrapper[4913]: I1001 14:26:01.720843 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:02 crc kubenswrapper[4913]: I1001 14:26:02.168517 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcbgr"] Oct 01 14:26:02 crc kubenswrapper[4913]: I1001 14:26:02.701769 4913 generic.go:334] "Generic (PLEG): container finished" podID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerID="8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b" exitCode=0 Oct 01 14:26:02 crc kubenswrapper[4913]: I1001 14:26:02.701867 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcbgr" event={"ID":"28e92916-51c7-4c12-a2df-af8ef7b33521","Type":"ContainerDied","Data":"8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b"} Oct 01 14:26:02 crc kubenswrapper[4913]: I1001 14:26:02.702075 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcbgr" event={"ID":"28e92916-51c7-4c12-a2df-af8ef7b33521","Type":"ContainerStarted","Data":"de4a86cd941f19cde82458667e07e2d97ef7a5e2ed268fff84614f1ab66ac14c"} Oct 01 14:26:04 crc kubenswrapper[4913]: I1001 14:26:04.721252 4913 generic.go:334] "Generic (PLEG): container finished" podID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerID="7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0" exitCode=0 Oct 01 14:26:04 crc kubenswrapper[4913]: I1001 14:26:04.721931 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcbgr" event={"ID":"28e92916-51c7-4c12-a2df-af8ef7b33521","Type":"ContainerDied","Data":"7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0"} Oct 01 14:26:05 crc kubenswrapper[4913]: I1001 14:26:05.732200 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcbgr" event={"ID":"28e92916-51c7-4c12-a2df-af8ef7b33521","Type":"ContainerStarted","Data":"ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07"} Oct 01 14:26:05 crc kubenswrapper[4913]: I1001 14:26:05.751328 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qcbgr" podStartSLOduration=2.143855395 podStartE2EDuration="4.751303674s" podCreationTimestamp="2025-10-01 14:26:01 +0000 UTC" firstStartedPulling="2025-10-01 14:26:02.703771007 +0000 UTC m=+6494.607246575" lastFinishedPulling="2025-10-01 14:26:05.311219256 +0000 UTC m=+6497.214694854" observedRunningTime="2025-10-01 14:26:05.745565416 +0000 UTC m=+6497.649041004" watchObservedRunningTime="2025-10-01 14:26:05.751303674 +0000 UTC m=+6497.654779252" Oct 01 14:26:06 crc kubenswrapper[4913]: I1001 14:26:06.791940 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pzf47_4f13db7d-cdb3-47bf-84db-d78e4f620eb9/kube-rbac-proxy/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.004337 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pzf47_4f13db7d-cdb3-47bf-84db-d78e4f620eb9/controller/0.log" Oct 01 14:26:07 crc 
kubenswrapper[4913]: I1001 14:26:07.049888 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-frr-files/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.258286 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-frr-files/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.279617 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-reloader/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.324732 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-metrics/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.324785 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-reloader/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.504639 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-reloader/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.519962 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-metrics/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.520042 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-frr-files/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.538548 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-metrics/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.711364 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-reloader/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.732360 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-frr-files/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.732969 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/cp-metrics/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.776230 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/controller/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.931356 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/frr-metrics/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.957526 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/kube-rbac-proxy-frr/0.log" Oct 01 14:26:07 crc kubenswrapper[4913]: I1001 14:26:07.981218 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/kube-rbac-proxy/0.log" Oct 01 14:26:08 crc kubenswrapper[4913]: I1001 14:26:08.224664 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/reloader/0.log" Oct 01 14:26:08 crc kubenswrapper[4913]: I1001 14:26:08.227649 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-kjqpz_cf61dc53-493d-4c23-b13a-a4a496d1014d/frr-k8s-webhook-server/0.log" Oct 01 14:26:08 crc kubenswrapper[4913]: I1001 14:26:08.462871 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74c9fc44b9-z2f2n_3028f0ec-9b00-468d-919b-5ed3b066bade/manager/0.log" Oct 01 14:26:08 crc kubenswrapper[4913]: I1001 14:26:08.685115 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nrghj_cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7/kube-rbac-proxy/0.log" Oct 01 14:26:08 crc kubenswrapper[4913]: I1001 14:26:08.749410 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64d4d5b6f-d7bsw_7bc350db-085b-4d87-a4a8-cb77c25746f9/webhook-server/0.log" Oct 01 14:26:08 crc kubenswrapper[4913]: I1001 14:26:08.848659 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:26:08 crc kubenswrapper[4913]: E1001 14:26:08.849227 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:26:09 crc kubenswrapper[4913]: I1001 14:26:09.553187 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nrghj_cc9b5c10-d3bd-4f45-abc2-d0462bbf93f7/speaker/0.log" Oct 01 14:26:09 crc kubenswrapper[4913]: I1001 14:26:09.963666 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ksptf_74154ce8-d469-4b0a-98f2-23206e3939a4/frr/0.log" Oct 01 14:26:11 crc kubenswrapper[4913]: I1001 14:26:11.721306 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:11 crc kubenswrapper[4913]: I1001 14:26:11.721626 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:11 crc kubenswrapper[4913]: I1001 14:26:11.772616 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:11 crc kubenswrapper[4913]: I1001 14:26:11.831919 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:12 crc kubenswrapper[4913]: I1001 14:26:12.007446 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcbgr"] Oct 01 14:26:13 crc kubenswrapper[4913]: I1001 14:26:13.807237 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qcbgr" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="registry-server" containerID="cri-o://ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07" gracePeriod=2 Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.274078 4913 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.440588 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-catalog-content\") pod \"28e92916-51c7-4c12-a2df-af8ef7b33521\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.440743 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-utilities\") pod \"28e92916-51c7-4c12-a2df-af8ef7b33521\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.440849 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx5ff\" (UniqueName: \"kubernetes.io/projected/28e92916-51c7-4c12-a2df-af8ef7b33521-kube-api-access-rx5ff\") pod \"28e92916-51c7-4c12-a2df-af8ef7b33521\" (UID: \"28e92916-51c7-4c12-a2df-af8ef7b33521\") " Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.441513 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-utilities" (OuterVolumeSpecName: "utilities") pod "28e92916-51c7-4c12-a2df-af8ef7b33521" (UID: "28e92916-51c7-4c12-a2df-af8ef7b33521"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.448984 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e92916-51c7-4c12-a2df-af8ef7b33521-kube-api-access-rx5ff" (OuterVolumeSpecName: "kube-api-access-rx5ff") pod "28e92916-51c7-4c12-a2df-af8ef7b33521" (UID: "28e92916-51c7-4c12-a2df-af8ef7b33521"). InnerVolumeSpecName "kube-api-access-rx5ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.454852 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28e92916-51c7-4c12-a2df-af8ef7b33521" (UID: "28e92916-51c7-4c12-a2df-af8ef7b33521"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.543316 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.543353 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx5ff\" (UniqueName: \"kubernetes.io/projected/28e92916-51c7-4c12-a2df-af8ef7b33521-kube-api-access-rx5ff\") on node \"crc\" DevicePath \"\"" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.543363 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e92916-51c7-4c12-a2df-af8ef7b33521-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.821729 4913 generic.go:334] "Generic (PLEG): container finished" podID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerID="ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07" exitCode=0 Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.822067 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcbgr" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.822342 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcbgr" event={"ID":"28e92916-51c7-4c12-a2df-af8ef7b33521","Type":"ContainerDied","Data":"ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07"} Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.822392 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcbgr" event={"ID":"28e92916-51c7-4c12-a2df-af8ef7b33521","Type":"ContainerDied","Data":"de4a86cd941f19cde82458667e07e2d97ef7a5e2ed268fff84614f1ab66ac14c"} Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.822416 4913 scope.go:117] "RemoveContainer" containerID="ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.865021 4913 scope.go:117] "RemoveContainer" containerID="7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.868927 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcbgr"] Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.881088 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcbgr"] Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.888499 4913 scope.go:117] "RemoveContainer" containerID="8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.936481 4913 scope.go:117] "RemoveContainer" containerID="ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07" Oct 01 14:26:14 crc kubenswrapper[4913]: E1001 14:26:14.937013 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07\": container with ID starting with ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07 not found: ID does not exist" containerID="ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.937049 4913 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07"} err="failed to get container status \"ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07\": rpc error: code = NotFound desc = could not find container \"ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07\": container with ID starting with ac764473cf3ebe2c09ae63337f8ee5ac165826de521d082a680a976d34ce8a07 not found: ID does not exist" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.937102 4913 scope.go:117] "RemoveContainer" containerID="7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0" Oct 01 14:26:14 crc kubenswrapper[4913]: E1001 14:26:14.937519 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0\": container with ID starting with 7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0 not found: ID does not exist" containerID="7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.937547 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0"} err="failed to get container status \"7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0\": rpc error: code = NotFound desc = could not find container \"7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0\": container with ID starting with 7719f54f82073b06d7e650fd7321c8504d76ef199e8e517bd73100f8a41f66d0 not found: ID does not exist" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.937567 4913 scope.go:117] "RemoveContainer" containerID="8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b" Oct 01 14:26:14 crc kubenswrapper[4913]: E1001 14:26:14.937817 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b\": container with ID starting with 8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b not found: ID does not exist" containerID="8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b" Oct 01 14:26:14 crc kubenswrapper[4913]: I1001 14:26:14.937843 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b"} err="failed to get container status \"8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b\": rpc error: code = NotFound desc = could not find container \"8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b\": container with ID starting with 8aebbbe1e790f3cfa5661dd43a23f9d91663de27fc2a107c93b974a105edf09b not found: ID does not exist" Oct 01 14:26:16 crc kubenswrapper[4913]: I1001 14:26:16.817303 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" path="/var/lib/kubelet/pods/28e92916-51c7-4c12-a2df-af8ef7b33521/volumes" Oct 01 14:26:19 crc kubenswrapper[4913]: I1001 14:26:19.806472 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:26:19 crc kubenswrapper[4913]: E1001 14:26:19.807262 4913 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.556026 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/util/0.log" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.693133 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/util/0.log" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.717530 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/pull/0.log" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.726110 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/pull/0.log" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.903175 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/util/0.log" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.924589 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/pull/0.log" Oct 01 14:26:20 crc kubenswrapper[4913]: I1001 14:26:20.936705 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbpbfg_e3a0a677-1707-4c62-8561-4b1fa7ac7b43/extract/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.062562 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/extract-utilities/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.280974 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/extract-content/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.282541 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/extract-utilities/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.287461 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/extract-content/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.464871 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/extract-utilities/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.470427 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/extract-content/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.649674 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/extract-utilities/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.859992 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/extract-content/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.868854 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/extract-utilities/0.log" Oct 01 14:26:21 crc kubenswrapper[4913]: I1001 14:26:21.957982 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/extract-content/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.156411 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/extract-utilities/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.216953 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/extract-content/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.504344 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/util/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.717432 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/util/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.719138 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/pull/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.724804 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mrpb7_5f4d496a-b3d5-49d0-88bb-aa061f342fd3/registry-server/0.log" Oct 01 14:26:22 crc kubenswrapper[4913]: I1001 14:26:22.994041 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/pull/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.214140 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/util/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.217034 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/extract/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.251166 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96l4q9j_a7dfff20-4135-47df-a052-3cf904ea1263/pull/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.315874 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dl8lx_ed53f181-8b36-4a70-a904-871780dda5cf/registry-server/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.532522 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xgpws_6a6a11a5-0c37-4537-9e97-4ef59ad7bc38/marketplace-operator/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.564627 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/extract-utilities/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.751126 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/extract-utilities/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.774929 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/extract-content/0.log" Oct 01 14:26:23 crc kubenswrapper[4913]: I1001 14:26:23.846579 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/extract-content/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.044870 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/extract-content/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.133953 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/extract-utilities/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.274559 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/extract-utilities/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.395108 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mxbn7_ed62703c-99a3-4c2f-8b04-286e67063932/registry-server/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.525229 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/extract-utilities/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.525724 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/extract-content/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.543122 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/extract-content/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.690427 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/extract-utilities/0.log" Oct 01 14:26:24 crc kubenswrapper[4913]: I1001 14:26:24.742862 4913 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/extract-content/0.log" Oct 01 14:26:25 crc kubenswrapper[4913]: I1001 14:26:25.402876 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tl7mr_1ad95641-9502-4c02-93aa-f77003a85ebb/registry-server/0.log" Oct 01 14:26:31 crc kubenswrapper[4913]: I1001 14:26:31.806902 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:26:31 crc kubenswrapper[4913]: E1001 14:26:31.807646 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:26:43 crc kubenswrapper[4913]: I1001 14:26:43.807491 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:26:43 crc kubenswrapper[4913]: E1001 14:26:43.808342 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.359425 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tc6fq"] Oct 01 14:26:46 crc kubenswrapper[4913]: E1001 14:26:46.360306 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="extract-utilities" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.360325 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="extract-utilities" Oct 01 14:26:46 crc kubenswrapper[4913]: E1001 14:26:46.360344 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="extract-content" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.360352 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="extract-content" Oct 01 14:26:46 crc kubenswrapper[4913]: E1001 14:26:46.360382 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="registry-server" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.360391 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="registry-server" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.360650 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e92916-51c7-4c12-a2df-af8ef7b33521" containerName="registry-server" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.362601 4913 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.371486 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tc6fq"] Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.473549 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqq6\" (UniqueName: \"kubernetes.io/projected/8631a0d8-711e-4548-9e6b-5173366a5485-kube-api-access-6nqq6\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.474014 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-utilities\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.474048 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-catalog-content\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.577339 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqq6\" (UniqueName: \"kubernetes.io/projected/8631a0d8-711e-4548-9e6b-5173366a5485-kube-api-access-6nqq6\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.577498 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-utilities\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.577537 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-catalog-content\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.578332 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-catalog-content\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.579054 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-utilities\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.607178 4913 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6nqq6\" (UniqueName: \"kubernetes.io/projected/8631a0d8-711e-4548-9e6b-5173366a5485-kube-api-access-6nqq6\") pod \"community-operators-tc6fq\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:46 crc kubenswrapper[4913]: I1001 14:26:46.704937 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:47 crc kubenswrapper[4913]: I1001 14:26:47.324637 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tc6fq"] Oct 01 14:26:48 crc kubenswrapper[4913]: I1001 14:26:48.126516 4913 generic.go:334] "Generic (PLEG): container finished" podID="8631a0d8-711e-4548-9e6b-5173366a5485" containerID="c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5" exitCode=0 Oct 01 14:26:48 crc kubenswrapper[4913]: I1001 14:26:48.126637 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerDied","Data":"c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5"} Oct 01 14:26:48 crc kubenswrapper[4913]: I1001 14:26:48.128082 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerStarted","Data":"a03ad0d2bba96c6471e520db9b99fa2c30207b54253bcdfde4d3fcde54294c01"} Oct 01 14:26:51 crc kubenswrapper[4913]: I1001 14:26:51.174930 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerStarted","Data":"b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c"} Oct 01 14:26:53 crc kubenswrapper[4913]: I1001 14:26:53.194151 4913 generic.go:334] "Generic (PLEG): container finished" podID="8631a0d8-711e-4548-9e6b-5173366a5485" containerID="b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c" exitCode=0 Oct 01 14:26:53 crc kubenswrapper[4913]: I1001 14:26:53.194280 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerDied","Data":"b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c"} Oct 01 14:26:54 crc kubenswrapper[4913]: I1001 14:26:54.204435 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerStarted","Data":"353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8"} Oct 01 14:26:54 crc kubenswrapper[4913]: I1001 14:26:54.230425 4913 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tc6fq" podStartSLOduration=2.495130103 podStartE2EDuration="8.230404903s" podCreationTimestamp="2025-10-01 14:26:46 +0000 UTC" firstStartedPulling="2025-10-01 14:26:48.127893772 +0000 UTC m=+6540.031369350" lastFinishedPulling="2025-10-01 14:26:53.863168572 +0000 UTC m=+6545.766644150" observedRunningTime="2025-10-01 14:26:54.223876702 +0000 UTC m=+6546.127352290" watchObservedRunningTime="2025-10-01 14:26:54.230404903 +0000 UTC m=+6546.133880481" Oct 01 14:26:54 crc kubenswrapper[4913]: I1001 14:26:54.811540 4913 scope.go:117] "RemoveContainer" 
containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:26:54 crc kubenswrapper[4913]: E1001 14:26:54.812149 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:26:56 crc kubenswrapper[4913]: I1001 14:26:56.706053 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:56 crc kubenswrapper[4913]: I1001 14:26:56.706394 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:26:56 crc kubenswrapper[4913]: I1001 14:26:56.750822 4913 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:27:06 crc kubenswrapper[4913]: I1001 14:27:06.757423 4913 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:27:06 crc kubenswrapper[4913]: I1001 14:27:06.828938 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tc6fq"] Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.317766 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tc6fq" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="registry-server" containerID="cri-o://353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8" gracePeriod=2 Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.807477 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:27:07 crc kubenswrapper[4913]: E1001 14:27:07.808038 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.828885 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.899721 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nqq6\" (UniqueName: \"kubernetes.io/projected/8631a0d8-711e-4548-9e6b-5173366a5485-kube-api-access-6nqq6\") pod \"8631a0d8-711e-4548-9e6b-5173366a5485\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.900089 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-catalog-content\") pod \"8631a0d8-711e-4548-9e6b-5173366a5485\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.900206 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-utilities\") pod \"8631a0d8-711e-4548-9e6b-5173366a5485\" (UID: \"8631a0d8-711e-4548-9e6b-5173366a5485\") " Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.900941 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-utilities" (OuterVolumeSpecName: "utilities") pod "8631a0d8-711e-4548-9e6b-5173366a5485" (UID: "8631a0d8-711e-4548-9e6b-5173366a5485"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.906172 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8631a0d8-711e-4548-9e6b-5173366a5485-kube-api-access-6nqq6" (OuterVolumeSpecName: "kube-api-access-6nqq6") pod "8631a0d8-711e-4548-9e6b-5173366a5485" (UID: "8631a0d8-711e-4548-9e6b-5173366a5485"). InnerVolumeSpecName "kube-api-access-6nqq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:27:07 crc kubenswrapper[4913]: I1001 14:27:07.959086 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8631a0d8-711e-4548-9e6b-5173366a5485" (UID: "8631a0d8-711e-4548-9e6b-5173366a5485"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.003693 4913 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.003762 4913 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8631a0d8-711e-4548-9e6b-5173366a5485-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.003777 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nqq6\" (UniqueName: \"kubernetes.io/projected/8631a0d8-711e-4548-9e6b-5173366a5485-kube-api-access-6nqq6\") on node \"crc\" DevicePath \"\"" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.330088 4913 generic.go:334] "Generic (PLEG): container finished" podID="8631a0d8-711e-4548-9e6b-5173366a5485" containerID="353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8" exitCode=0 Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.330165 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tc6fq" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.330174 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerDied","Data":"353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8"} Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.330635 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc6fq" event={"ID":"8631a0d8-711e-4548-9e6b-5173366a5485","Type":"ContainerDied","Data":"a03ad0d2bba96c6471e520db9b99fa2c30207b54253bcdfde4d3fcde54294c01"} Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.330659 4913 scope.go:117] "RemoveContainer" containerID="353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.353257 4913 scope.go:117] "RemoveContainer" containerID="b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.391146 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tc6fq"] Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.396467 4913 scope.go:117] "RemoveContainer" containerID="c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.402403 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tc6fq"] Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.443722 4913 scope.go:117] "RemoveContainer" containerID="353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8" Oct 01 14:27:08 crc kubenswrapper[4913]: E1001 14:27:08.444315 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8\": container with ID starting with 353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8 not found: ID does not exist" containerID="353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.444361 
4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8"} err="failed to get container status \"353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8\": rpc error: code = NotFound desc = could not find container \"353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8\": container with ID starting with 353810f50ba45dada7cf19e686ba0bcb780fe315d848c3f1537ee9e9a666a5b8 not found: ID does not exist" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.444388 4913 scope.go:117] "RemoveContainer" containerID="b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c" Oct 01 14:27:08 crc kubenswrapper[4913]: E1001 14:27:08.444893 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c\": container with ID starting with b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c not found: ID does not exist" containerID="b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.444954 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c"} err="failed to get container status \"b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c\": rpc error: code = NotFound desc = could not find container \"b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c\": container with ID starting with b1d20c2f17854caca0021ed7b88f79f7ec7218a66d25cebe7160520b259d698c not found: ID does not exist" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.444990 4913 scope.go:117] "RemoveContainer" containerID="c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5" Oct 01 14:27:08 crc kubenswrapper[4913]: E1001 14:27:08.445299 4913 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5\": container with ID starting with c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5 not found: ID does not exist" containerID="c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.445325 4913 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5"} err="failed to get container status \"c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5\": rpc error: code = NotFound desc = could not find container \"c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5\": container with ID starting with c086eb8cfe13ded3dfbae67e2e078f9e354815ddb4039fbe11d33e5414f6f9c5 not found: ID does not exist" Oct 01 14:27:08 crc kubenswrapper[4913]: E1001 14:27:08.473617 4913 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8631a0d8_711e_4548_9e6b_5173366a5485.slice\": RecentStats: unable to find data in memory cache]" Oct 01 14:27:08 crc kubenswrapper[4913]: E1001 14:27:08.813757 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" 
hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:27:08 crc kubenswrapper[4913]: I1001 14:27:08.817770 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" path="/var/lib/kubelet/pods/8631a0d8-711e-4548-9e6b-5173366a5485/volumes" Oct 01 14:27:20 crc kubenswrapper[4913]: I1001 14:27:20.806443 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:27:20 crc kubenswrapper[4913]: E1001 14:27:20.807427 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:27:34 crc kubenswrapper[4913]: I1001 14:27:34.806520 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:27:34 crc kubenswrapper[4913]: E1001 14:27:34.807287 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:27:45 crc kubenswrapper[4913]: I1001 14:27:45.806881 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:27:45 crc kubenswrapper[4913]: E1001 14:27:45.807651 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:27:56 crc kubenswrapper[4913]: I1001 14:27:56.807231 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:27:56 crc kubenswrapper[4913]: E1001 14:27:56.808057 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:28:09 crc kubenswrapper[4913]: I1001 14:28:09.806160 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:28:09 crc kubenswrapper[4913]: E1001 14:28:09.807147 4913 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8hltg_openshift-machine-config-operator(e8903e6e-381f-4f5c-b9c5-5242c3de2897)\"" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" Oct 01 14:28:11 crc kubenswrapper[4913]: E1001 14:28:11.806930 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:28:22 crc kubenswrapper[4913]: I1001 14:28:22.811946 4913 scope.go:117] "RemoveContainer" containerID="be786056488b283e3423027cb249f3cba5edf64ec07f3ec894f31474d512bcb6" Oct 01 14:28:24 crc kubenswrapper[4913]: I1001 14:28:24.011378 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" event={"ID":"e8903e6e-381f-4f5c-b9c5-5242c3de2897","Type":"ContainerStarted","Data":"fb9e72fe42ce218b4912efa643c32160195cafc140120613ceb5f85cda4349d2"} Oct 01 14:28:45 crc kubenswrapper[4913]: I1001 14:28:45.243593 4913 generic.go:334] "Generic (PLEG): container finished" podID="833aa627-cc45-4ac9-b59b-9b0116867977" containerID="76d46fec2319fad77761fe3136cbd7358fe4d94031ec542d7b9db1748e45062f" exitCode=0 Oct 01 14:28:45 crc kubenswrapper[4913]: I1001 14:28:45.243733 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fzk86/must-gather-9ddn5" event={"ID":"833aa627-cc45-4ac9-b59b-9b0116867977","Type":"ContainerDied","Data":"76d46fec2319fad77761fe3136cbd7358fe4d94031ec542d7b9db1748e45062f"} Oct 01 14:28:45 crc kubenswrapper[4913]: I1001 14:28:45.244685 4913 scope.go:117] "RemoveContainer" containerID="76d46fec2319fad77761fe3136cbd7358fe4d94031ec542d7b9db1748e45062f" Oct 01 14:28:45 crc kubenswrapper[4913]: I1001 14:28:45.911436 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fzk86_must-gather-9ddn5_833aa627-cc45-4ac9-b59b-9b0116867977/gather/0.log" Oct 01 14:28:54 crc kubenswrapper[4913]: I1001 14:28:54.895092 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fzk86/must-gather-9ddn5"] Oct 01 14:28:54 crc kubenswrapper[4913]: I1001 14:28:54.906058 4913 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fzk86/must-gather-9ddn5" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="copy" containerID="cri-o://f609ef08b1179a78e9facd28eab4cfd260c1cfaa292cd6dd984b5ecd9f8939dd" gracePeriod=2 Oct 01 14:28:54 crc kubenswrapper[4913]: I1001 14:28:54.943037 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fzk86/must-gather-9ddn5"] Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.333725 4913 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fzk86_must-gather-9ddn5_833aa627-cc45-4ac9-b59b-9b0116867977/copy/0.log" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.334606 4913 generic.go:334] "Generic (PLEG): container finished" podID="833aa627-cc45-4ac9-b59b-9b0116867977" containerID="f609ef08b1179a78e9facd28eab4cfd260c1cfaa292cd6dd984b5ecd9f8939dd" exitCode=143 Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.334658 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b791867aa270a3fbcd952ea9dce7864dd22b5ad57bf49e96246b357df051b6b2" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.339983 4913 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-fzk86_must-gather-9ddn5_833aa627-cc45-4ac9-b59b-9b0116867977/copy/0.log" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.340582 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.444086 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/833aa627-cc45-4ac9-b59b-9b0116867977-must-gather-output\") pod \"833aa627-cc45-4ac9-b59b-9b0116867977\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.444185 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4rck\" (UniqueName: \"kubernetes.io/projected/833aa627-cc45-4ac9-b59b-9b0116867977-kube-api-access-v4rck\") pod \"833aa627-cc45-4ac9-b59b-9b0116867977\" (UID: \"833aa627-cc45-4ac9-b59b-9b0116867977\") " Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.450076 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833aa627-cc45-4ac9-b59b-9b0116867977-kube-api-access-v4rck" (OuterVolumeSpecName: "kube-api-access-v4rck") pod "833aa627-cc45-4ac9-b59b-9b0116867977" (UID: "833aa627-cc45-4ac9-b59b-9b0116867977"). InnerVolumeSpecName "kube-api-access-v4rck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.547587 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4rck\" (UniqueName: \"kubernetes.io/projected/833aa627-cc45-4ac9-b59b-9b0116867977-kube-api-access-v4rck\") on node \"crc\" DevicePath \"\"" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.612615 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833aa627-cc45-4ac9-b59b-9b0116867977-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "833aa627-cc45-4ac9-b59b-9b0116867977" (UID: "833aa627-cc45-4ac9-b59b-9b0116867977"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:28:55 crc kubenswrapper[4913]: I1001 14:28:55.649685 4913 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/833aa627-cc45-4ac9-b59b-9b0116867977-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 14:28:56 crc kubenswrapper[4913]: I1001 14:28:56.342688 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fzk86/must-gather-9ddn5" Oct 01 14:28:56 crc kubenswrapper[4913]: I1001 14:28:56.822003 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" path="/var/lib/kubelet/pods/833aa627-cc45-4ac9-b59b-9b0116867977/volumes" Oct 01 14:29:09 crc kubenswrapper[4913]: I1001 14:29:09.430302 4913 scope.go:117] "RemoveContainer" containerID="f609ef08b1179a78e9facd28eab4cfd260c1cfaa292cd6dd984b5ecd9f8939dd" Oct 01 14:29:09 crc kubenswrapper[4913]: I1001 14:29:09.454734 4913 scope.go:117] "RemoveContainer" containerID="76d46fec2319fad77761fe3136cbd7358fe4d94031ec542d7b9db1748e45062f" Oct 01 14:29:09 crc kubenswrapper[4913]: I1001 14:29:09.500663 4913 scope.go:117] "RemoveContainer" containerID="29afacbd7cc4b8bd8afffbae54b6be20fdbdc65fa7889c68178721d41f39c2a9" Oct 01 14:29:38 crc kubenswrapper[4913]: E1001 14:29:38.815024 4913 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.214716 4913 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr"] Oct 01 14:30:00 crc kubenswrapper[4913]: E1001 14:30:00.217112 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.217176 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4913]: E1001 14:30:00.217319 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="gather" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.217337 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="gather" Oct 01 14:30:00 crc kubenswrapper[4913]: E1001 14:30:00.217406 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="extract-utilities" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.217417 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="extract-utilities" Oct 01 14:30:00 crc kubenswrapper[4913]: E1001 14:30:00.217480 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="extract-content" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.217491 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="extract-content" Oct 01 14:30:00 crc kubenswrapper[4913]: E1001 14:30:00.217550 4913 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="copy" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.217561 4913 state_mem.go:107] "Deleted CPUSet assignment" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="copy" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.218194 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="8631a0d8-711e-4548-9e6b-5173366a5485" containerName="registry-server" Oct 01 14:30:00 crc 
kubenswrapper[4913]: I1001 14:30:00.218229 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="gather" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.218258 4913 memory_manager.go:354] "RemoveStaleState removing state" podUID="833aa627-cc45-4ac9-b59b-9b0116867977" containerName="copy" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.220333 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.229801 4913 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.230247 4913 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.233151 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr"] Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.299133 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69a2585f-6552-438e-a7af-044ac7ca9996-config-volume\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.299371 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69a2585f-6552-438e-a7af-044ac7ca9996-secret-volume\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.299460 4913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkg7q\" (UniqueName: \"kubernetes.io/projected/69a2585f-6552-438e-a7af-044ac7ca9996-kube-api-access-rkg7q\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.402045 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69a2585f-6552-438e-a7af-044ac7ca9996-secret-volume\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.402157 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkg7q\" (UniqueName: \"kubernetes.io/projected/69a2585f-6552-438e-a7af-044ac7ca9996-kube-api-access-rkg7q\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.402446 4913 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/69a2585f-6552-438e-a7af-044ac7ca9996-config-volume\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.403955 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69a2585f-6552-438e-a7af-044ac7ca9996-config-volume\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.408926 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69a2585f-6552-438e-a7af-044ac7ca9996-secret-volume\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.419069 4913 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkg7q\" (UniqueName: \"kubernetes.io/projected/69a2585f-6552-438e-a7af-044ac7ca9996-kube-api-access-rkg7q\") pod \"collect-profiles-29322150-bgdpr\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:00 crc kubenswrapper[4913]: I1001 14:30:00.554419 4913 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:01 crc kubenswrapper[4913]: I1001 14:30:01.050486 4913 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr"] Oct 01 14:30:01 crc kubenswrapper[4913]: I1001 14:30:01.971540 4913 generic.go:334] "Generic (PLEG): container finished" podID="69a2585f-6552-438e-a7af-044ac7ca9996" containerID="50ae7ef0d511b3efd1e0ffc1563d286f4688c357f9dc3d1518dd4d9bededc0f7" exitCode=0 Oct 01 14:30:01 crc kubenswrapper[4913]: I1001 14:30:01.971647 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" event={"ID":"69a2585f-6552-438e-a7af-044ac7ca9996","Type":"ContainerDied","Data":"50ae7ef0d511b3efd1e0ffc1563d286f4688c357f9dc3d1518dd4d9bededc0f7"} Oct 01 14:30:01 crc kubenswrapper[4913]: I1001 14:30:01.972103 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" event={"ID":"69a2585f-6552-438e-a7af-044ac7ca9996","Type":"ContainerStarted","Data":"3eeed026a1231b15206b11da02ee4d39d4b5ddd232414020eacfd2563536692c"} Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.419294 4913 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.577850 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69a2585f-6552-438e-a7af-044ac7ca9996-secret-volume\") pod \"69a2585f-6552-438e-a7af-044ac7ca9996\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.577948 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69a2585f-6552-438e-a7af-044ac7ca9996-config-volume\") pod \"69a2585f-6552-438e-a7af-044ac7ca9996\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.578039 4913 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkg7q\" (UniqueName: \"kubernetes.io/projected/69a2585f-6552-438e-a7af-044ac7ca9996-kube-api-access-rkg7q\") pod \"69a2585f-6552-438e-a7af-044ac7ca9996\" (UID: \"69a2585f-6552-438e-a7af-044ac7ca9996\") " Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.579348 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a2585f-6552-438e-a7af-044ac7ca9996-config-volume" (OuterVolumeSpecName: "config-volume") pod "69a2585f-6552-438e-a7af-044ac7ca9996" (UID: "69a2585f-6552-438e-a7af-044ac7ca9996"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.580516 4913 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69a2585f-6552-438e-a7af-044ac7ca9996-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.597433 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a2585f-6552-438e-a7af-044ac7ca9996-kube-api-access-rkg7q" (OuterVolumeSpecName: "kube-api-access-rkg7q") pod "69a2585f-6552-438e-a7af-044ac7ca9996" (UID: "69a2585f-6552-438e-a7af-044ac7ca9996"). InnerVolumeSpecName "kube-api-access-rkg7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.598807 4913 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a2585f-6552-438e-a7af-044ac7ca9996-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "69a2585f-6552-438e-a7af-044ac7ca9996" (UID: "69a2585f-6552-438e-a7af-044ac7ca9996"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.682788 4913 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkg7q\" (UniqueName: \"kubernetes.io/projected/69a2585f-6552-438e-a7af-044ac7ca9996-kube-api-access-rkg7q\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.682822 4913 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69a2585f-6552-438e-a7af-044ac7ca9996-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.993995 4913 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" event={"ID":"69a2585f-6552-438e-a7af-044ac7ca9996","Type":"ContainerDied","Data":"3eeed026a1231b15206b11da02ee4d39d4b5ddd232414020eacfd2563536692c"} Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.994046 4913 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-bgdpr" Oct 01 14:30:03 crc kubenswrapper[4913]: I1001 14:30:03.994058 4913 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eeed026a1231b15206b11da02ee4d39d4b5ddd232414020eacfd2563536692c" Oct 01 14:30:04 crc kubenswrapper[4913]: I1001 14:30:04.492156 4913 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn"] Oct 01 14:30:04 crc kubenswrapper[4913]: I1001 14:30:04.499166 4913 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-lhbtn"] Oct 01 14:30:04 crc kubenswrapper[4913]: I1001 14:30:04.818177 4913 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d30dfe-c157-4393-8a07-60ba3fc50e49" path="/var/lib/kubelet/pods/c2d30dfe-c157-4393-8a07-60ba3fc50e49/volumes" Oct 01 14:30:09 crc kubenswrapper[4913]: I1001 14:30:09.559221 4913 scope.go:117] "RemoveContainer" containerID="acd7a98cd77b56d363d3d820147fbd1eb6265cb68de9a2409b7d9bff21658c5d" Oct 01 14:30:40 crc kubenswrapper[4913]: I1001 14:30:40.083440 4913 patch_prober.go:28] interesting pod/machine-config-daemon-8hltg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:30:40 crc kubenswrapper[4913]: I1001 14:30:40.084049 4913 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8hltg" podUID="e8903e6e-381f-4f5c-b9c5-5242c3de2897" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"